Blogs

Why is COVID-19 data an important part of data analytics?

Data insights drawn from publicly available, localized COVID-19 data combined with internal data are key to effectively managing business operations during the COVID-19 pandemic.

Organizations always have data to assist them in managing their operations and in forecasting sales and revenues. In today’s environment, external COVID-19 data combined with internal data has become a necessary tool for organizations’ leaders in understanding and addressing three key areas:

Business impact – Scale of impact

Response – Operations continuity

Path forward – Agile operations

Organizations, especially smaller ones, have better chances of surviving and thriving in this economy if they have access to some real-time quantitative analytics.

Even with the availability of public data on COVID-19, I find organizations struggling to build simple dashboards for quick data insights. Recently, I started assisting my clients in developing simple assimilated dashboards that overlay COVID-19 data on their internal data. They were quickly able to view their impacted market segments using relevant what-if scenarios and trends. The turnaround time was short and the results were impactful. I am seeing great results with customers who have taken this approach of having quick analytics.
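The overlay idea can be sketched in a few lines: join public case counts with internal figures on a shared (date, region) key so both series appear side by side on a dashboard. The sample data and field names below are illustrative assumptions, not a specific client's dataset.

```python
# Hypothetical sample of public, localized COVID-19 case data, keyed by (date, region).
public_cases = {
    ("2020-07-01", "King County"): 120,
    ("2020-07-01", "Pierce County"): 45,
    ("2020-07-02", "King County"): 150,
}

# Hypothetical internal sales data keyed the same way.
internal_sales = {
    ("2020-07-01", "King County"): 18_000,
    ("2020-07-01", "Pierce County"): 7_500,
    ("2020-07-02", "King County"): 14_200,
}

def overlay(cases, sales):
    """Join the two datasets on (date, region) for dashboarding."""
    rows = []
    for key in sorted(set(cases) & set(sales)):
        day, region = key
        rows.append({
            "date": day,
            "region": region,
            "new_cases": cases[key],
            "sales": sales[key],
        })
    return rows

dashboard_rows = overlay(public_cases, internal_sales)
for row in dashboard_rows:
    print(row)
```

With the two series aligned, a what-if view is just a filter or scaling applied to the joined rows.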

How do you deliver on data initiatives in a competitive talent market?

In recent years, the term data science has become more popular due to the influx of data in all businesses. Data science is about getting valuable insights and answering questions by analyzing data using statistical methods, computing power, and automation. When a business is looking to answer a data-driven question, they must follow a set of predefined steps, known as the data science process, and know what these steps involve.

The process of data science includes more than one role. The roles within this process include business analysts, data engineers, data scientists, and developers. Even though there can be some overlap, each of these roles plays a vital part in the process. The business analyst provides the business understanding that guides the project. The data engineer prepares the data for use by the data scientist in model training. The data scientist must understand the data to train and test the model. The developer is responsible for deploying and operationalizing the model.
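The hand-offs between these roles can be sketched as sequential stages, each owned by one role. Everything below (function names, the toy records, the trivial majority-class model) is illustrative, not a specific framework.

```python
def business_understanding(question):
    # Business analyst: frame the question and the success metric.
    return {"question": question, "metric": "accuracy"}

def prepare_data(raw):
    # Data engineer: clean and shape raw records for modeling.
    return [r for r in raw if r["label"] is not None]

def train_and_test(data):
    # Data scientist: fit and evaluate a (deliberately trivial) model
    # that always predicts the majority class.
    positives = sum(1 for r in data if r["label"] == 1)
    majority = 1 if positives >= len(data) / 2 else 0
    accuracy = sum(1 for r in data if r["label"] == majority) / len(data)
    return {"model": majority, "accuracy": accuracy}

def deploy(result):
    # Developer: wrap the trained model for operational use.
    return lambda record: result["model"]

raw = [{"label": 1}, {"label": 0}, {"label": 1}, {"label": None}]
spec = business_understanding("Will a customer churn?")
clean = prepare_data(raw)
result = train_and_test(clean)
predict = deploy(result)
print(result["accuracy"])
```

The point is the shape of the pipeline, not the model: each stage consumes the previous stage's output, which is why the overlap between roles matters.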


These days organizations are finding it hard to retain talent for their data science processes. Fueled by big data and AI, demand for data science skills is growing exponentially, according to job sites, while the supply of skilled applicants is growing at a slower pace. A KPMG CIO Survey of over 3,600 technology leaders at companies across the U.S. found that 46% of chief information officers see “big data and analytics” as the area most affected by the shortage in the nation’s job market.

One way to address this shortage is by partnering with vendors who offer data science services. This approach gives in-house data science teams access to resources including industry knowledge, skills, and experience to deliver great data products for data-driven decision making. Most vendors offer these services on a project basis. That is a great way to accelerate data work in large organizations, but it is hard to sustain over a long period due to cost, especially for small to midsize companies, and can cause data initiatives to slow down or go undelivered.

The model I have found more effective over the long term, especially for small to medium-size businesses, is DSaaS (Data Science as a Service), where the client has access to an entire data science team on a monthly subscription basis. This model keeps costs down and takes away the headaches that go along with retaining a large data science team. I also like this approach because it aligns with the agile philosophy of delivery, which has a higher rate of success than the traditional waterfall approach. A few firms offer data strategy and engineering services in this format, such as datatelligent.ai, which delivers customized analytics and AI solutions.

What is Data Lake Storage?

The Data Lake feature allows you to perform analytics on your data usage and prepare reports. A data lake is a large repository that stores both structured and unstructured data. Data Lake Storage combines the scalability and cost benefits of object storage with the reliability and performance of Big Data file system capabilities. Azure Data Lake stores all your business data in this way and makes it available for analysis.
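The "structured and unstructured side by side" idea can be illustrated with a small local simulation of a data lake's folder layout. The folder names, partitioning scheme, and file contents below are illustrative assumptions, not Azure's actual on-disk format.

```python
import json
import tempfile
from pathlib import Path

# Simulate one store holding both kinds of data side by side.
lake = Path(tempfile.mkdtemp()) / "datalake"

# Structured: partitioned, schema-bearing sales records.
sales_dir = lake / "structured" / "sales" / "year=2024"
sales_dir.mkdir(parents=True)
(sales_dir / "part-0000.csv").write_text("region,amount\nwest,100\neast,250\n")

# Unstructured: a support-ticket transcript stored as-is.
raw_dir = lake / "unstructured" / "tickets"
raw_dir.mkdir(parents=True)
(raw_dir / "ticket-42.txt").write_text("Customer reports login failures...")

# Analytics tooling can then scan the whole store, regardless of format.
files = sorted(p.relative_to(lake).as_posix() for p in lake.rglob("*") if p.is_file())
print(files)
```

The key property this sketches is that ingestion does not force a schema: structured files keep theirs, raw files are kept verbatim, and analysis decides later how to read each.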


Cloud cost management starts with applying compute services that are optimized for specific use cases.

The goal of cloud computing is to make running a business easier and more efficient, whether it’s a small start-up or a large enterprise. Every business is unique and has different needs. To meet those needs, cloud computing providers offer a wide range of services. Cloud compute services including Virtual Machines, Containers, App Service, and Serverless computing offer application development and deployment approaches that, if applied correctly, can save time and money. Each service provides benefits as well as tradeoffs against the other options, and IT needs a good understanding of all of them.

A Virtual Machine (VM) is an emulation of a physical computer; it offers more control, which comes with maintenance overhead.

Containers provide a consistent, isolated execution environment for applications. They are similar to VMs except they don’t require a guest operating system. Instead, the application and all its dependencies are packaged into a “container,” and a standard runtime environment is used to execute the app. This allows the container to start up in just a few seconds, because there’s no OS to boot and initialize; you only need the app to launch.

Serverless computing lets you run application code without creating, configuring, or maintaining a server. The core idea is that your application is broken into separate functions that run when triggered by some action, which makes it ideal for automated tasks. As with the other compute services, each approach is optimized for specific use cases.
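The "functions bound to triggers" idea can be sketched without any cloud at all: register small functions against named events and invoke them only when an event fires. The dispatcher, trigger names, and handlers below are illustrative, not any provider's actual runtime.

```python
# Registry mapping trigger names to handler functions.
handlers = {}

def on(trigger):
    """Register a function to run when `trigger` fires."""
    def register(fn):
        handlers[trigger] = fn
        return fn
    return register

@on("file_uploaded")
def resize_image(event):
    return f"resized {event['name']}"

@on("nightly_timer")
def archive_logs(event):
    return "archived logs"

def fire(trigger, event=None):
    # In a real platform, this is where the provider spins up the
    # function on demand; there is no server for you to manage.
    return handlers[trigger](event or {})

print(fire("file_uploaded", {"name": "photo.png"}))
```

The billing model follows the same shape: you pay per invocation of `fire`, not for an always-on server waiting for work.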


Insights on SaaS (Software as a Service) environment data are critical for operational efficiency and risk reduction

The proliferation of SaaS (Software as a Service) has made the delivery of technology for the business easier, faster, and cheaper. SaaS is now a common system of record for organizations. This change has revolutionized the modern workplace and upended the traditional way of managing and securing IT services, bringing a completely new paradigm for IT teams in how they manage, secure, and support this new landscape.

Organizations must understand exactly how SaaS applications operate and interact with each other. That includes understanding which information needs to be centralized and discovered, and building insights on the data that is relevant to increasing operational efficiency. To reduce security risks and increase compliance, organizations must introduce automation where possible and apply analytics to operational data to avoid alert fatigue.
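One concrete form of "analytics on operational data to avoid alert fatigue" is collapsing repeated raw alerts into a single summarized alert per application and alert type, escalating only above a threshold. The alert fields and the threshold below are illustrative assumptions.

```python
from collections import Counter

# Hypothetical raw alerts streamed in from SaaS applications.
raw_alerts = [
    {"app": "CRM", "type": "failed_login"},
    {"app": "CRM", "type": "failed_login"},
    {"app": "CRM", "type": "failed_login"},
    {"app": "HR",  "type": "permission_change"},
]

def summarize(alerts, escalate_at=3):
    """Collapse duplicates into one row per (app, type); flag high counts."""
    counts = Counter((a["app"], a["type"]) for a in alerts)
    return [
        {"app": app, "type": kind, "count": n, "escalate": n >= escalate_at}
        for (app, kind), n in sorted(counts.items())
    ]

summary = summarize(raw_alerts)
for row in summary:
    print(row)
```

Instead of four pages, the on-call team sees two summarized rows, only one of which is flagged for escalation.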

A comprehensive data strategy including centralization, discoverability, insights, action, automation, delegation and auditability is needed to fill the gaps introduced by today’s SaaS environments and to gain the level of control and clarity that is essential for properly securing the corporate environment.


Analytics in Healthcare – Sources – Analytics – Applications


Roadmap to data insights


The role of an enterprise architect is critical to the success of cloud migration and to ongoing optimization of the cloud. Here are a few reasons why.

The enterprise architect (EA) can play a key leadership role in cloud migration. The goal of any EA is to ensure that the highest business value is received for the most efficient use of technology resources; as such, the EA provides the essential bridge between business and IT. Typically, the EA maintains the list of IT capabilities and processes, facilitates the creation and implementation of IT strategies, works with businesses and executives to understand the long-term goals of the company in order to plan for the future, and drives various enterprise-wide governance activities, such as architecture review. For these reasons, the EA is an ideal choice to lead the Cloud Strategy Team.

The EA, overseeing the IT ecosystem as a whole, is in a position to provide the appropriate analyses of system capabilities and the application impacts of any large-scale changes to the ecosystem. Often, it is the EA that creates and maintains the portfolio management system (the catalog of applications) from which the prioritization of applications to be moved to the cloud can be drawn. The EA team should examine what is known about the portfolio and where additional information is needed (for example, whether an application is virtualized), add these attributes to the knowledge base, and engage with other parts of IT to collect the data.

Cloud migration offers the enterprise architect many opportunities. By using modeling techniques such as business capability analysis and capability maturity models, it is usually possible, as the prioritization process for applications takes place, to simplify IT by consolidating applications of similar function. Consolidation will have clear financial benefits both by reducing the compute, data, and network requirements, as well as by simplifying the operations and maintenance functions.

The enterprise architect can use the opportunity afforded by cloud migration to analyze the data models used by applications and update them to enterprise-wide canonical models. Such an effort will streamline application integration and reduce semantic mismatches between disparate data models, which often require manual adjustment in a complex on-premises environment.
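The canonical-model idea can be sketched as a mapping layer: each application keeps its own field names, and a per-application mapping translates records into one shared shape. All field names and records below are illustrative assumptions.

```python
# The enterprise-wide canonical shape for a customer record.
CANONICAL_FIELDS = ("customer_id", "full_name", "email")

# Per-application field mappings, maintained by the EA team.
mappings = {
    "crm":     {"customer_id": "CustID", "full_name": "Name",  "email": "EmailAddr"},
    "billing": {"customer_id": "acct",   "full_name": "owner", "email": "contact"},
}

def to_canonical(app, record):
    """Translate one application's record into the canonical model."""
    mapping = mappings[app]
    return {field: record[mapping[field]] for field in CANONICAL_FIELDS}

crm_rec = {"CustID": "C-1", "Name": "Ada Lovelace", "EmailAddr": "ada@example.com"}
bill_rec = {"acct": "C-1", "owner": "Ada Lovelace", "contact": "ada@example.com"}

# Records from two systems with different schemas converge on one shape.
print(to_canonical("crm", crm_rec) == to_canonical("billing", bill_rec))
```

Integration code then talks only to the canonical shape, so adding a third application means adding one mapping rather than adjusting every point-to-point interface.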

In addition, it is the EA core responsibility to create and maintain as-is and to-be roadmaps of the overall IT ecosystem. The EA should easily be able to communicate the various stages of the migration, summarizing the current thinking of the Cloud Strategy Team.

Finally, the EA should direct the investigation into the use of new cloud technologies to either augment existing capabilities or provide entirely new functionality to IT applications, and as these are validated, to add these to the existing roadmaps. Enterprise architects need to experiment with new technologies as well as understand and communicate their business value to IT management and business stakeholders. Successful investigations should lead to the development and publishing of reference architectures that applications teams can reuse.