How can Data/Analytics and AI/ML enable better business decision-making? How can we unlock the potential in your data?
Given a relatively volatile set of macroeconomic, socio-political and geographical (e.g., supply chain) business uncertainties on the one hand, and the promise of new technologies and new possibilities on the other, how can we capitalize on these potential opportunities?
Opportunities lie in the latent space between uncertainty and certainty, where data remains locked with the potential to be unleashed into innovative new solutions. How do you think upcoming technologies like AI/ML and Data/Analytics can help us simplify the complex organizational decision-making process?
Decisions require context
Richer context: the more contextually enriched data we assemble, stitching together disparate and apparently unrelated data items, the more confident the conclusions we are able to draw, and the clearer the vector sum of where the data is actually pointing.
Data finds data. A growing web [read: graph] of deeper and deeper contextually relevant relationships can provide increasingly richer datasets that build the case for unearthing a trend and following it out of latent space into reality. In this way, the degree of certainty increases, uncertainty [entropy] tends to decrease, and the distant shadow of a definitive direction starts to take clearer shape as a decision we can be more firmly confident of…
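As a toy illustration of "data finds data", here is a minimal sketch (all entity and attribute names are hypothetical) that links records from disparate sources into a graph, where edge weight grows with shared context:

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical records from disparate sources; each carries some attributes.
records = [
    {"entity": "acme_corp",  "attrs": {"port_of_rotterdam", "supplier_x", "eur"}},
    {"entity": "beta_ltd",   "attrs": {"port_of_rotterdam", "supplier_x", "usd"}},
    {"entity": "gamma_gmbh", "attrs": {"supplier_y", "eur"}},
]

def build_graph(records):
    """Link entities that share attributes; edge weight = number of shared attributes."""
    graph = defaultdict(int)
    for a, b in combinations(records, 2):
        shared = a["attrs"] & b["attrs"]
        if shared:
            graph[(a["entity"], b["entity"])] = len(shared)
    return dict(graph)

graph = build_graph(records)
# Entities sharing more context get higher-weight edges: a stronger case
# that they belong to the same emerging trend.
print(graph)
```

The higher the edge weight, the richer the shared context, which is the intuition behind letting contextual relationships compound into higher-confidence conclusions.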
But how do we get there? Here are some suggested steps.
- Get the data house in order. Yes, this is a non-trivial process.
- Leverage the data cloud to build a data lake that supports both real-time streaming (e.g., Cloud Pub/Sub into Google BigQuery) and batch processing, for example with BigLake.
- Clean, curate and label data for AI/ML training
- Publish in a feature store, like Vertex AI Feature Store. This can become the governed gold standard for multiple use cases across the organization, used and reused by multiple teams, starting from a known lineage.
- Build end-to-end pipelines: one that automates data flow into the data lake, and others that produce feature store outputs for subsequent ML training.
- Gain insights into your data through exploratory visualization and deeper analysis, such as with Looker (note its semantic modeling layer) and Data Studio (with direct integration into BigQuery).
- Use the formal construct of an Experiment in Vertex AI to try out various algorithms and optimization techniques, while keeping track of which experiment started from which data and used which optimizations and algorithms.
- Evaluate the models produced and explore explainability in AI.
- Publish the models in a Model Registry like Vertex AI Model Registry.
- Choose the versions of the models you wish to advocate and take them through the governance process, perhaps with CI/CD, to push into production.
- Use a pipeline to deliver the models from the Vertex AI Model Registry, after metadata exploration, into a production endpoint.
- Monitor the endpoint, using Vertex AI Model Monitoring.
- Serve using a managed endpoint, a GKE cluster or GCE, depending on your organizational standards, and automate the process using Vertex AI Pipelines.
- As you monitor, re-trigger the data pipeline, the training pipeline and the deployment-to-production pipelines, with the various governance processes implemented through Vertex AI Pipelines.
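To make the feature store step above concrete, here is a minimal sketch of the core idea: versioned feature values with recorded lineage, shared across teams. This is a hypothetical toy API, not the Vertex AI Feature Store SDK:

```python
from datetime import datetime, timezone

class FeatureStore:
    """Toy feature store: versioned feature values with recorded lineage."""

    def __init__(self):
        self._store = {}  # (entity_id, feature) -> list of versioned values

    def write(self, entity_id, feature, value, lineage):
        versions = self._store.setdefault((entity_id, feature), [])
        versions.append({
            "value": value,
            "lineage": lineage,  # which pipeline run produced this value
            "written_at": datetime.now(timezone.utc).isoformat(),
        })

    def read_latest(self, entity_id, feature):
        return self._store[(entity_id, feature)][-1]

fs = FeatureStore()
fs.write("customer_42", "orders_30d", 7, lineage="batch_pipeline_run_118")
latest = fs.read_latest("customer_42", "orders_30d")
print(latest["value"], latest["lineage"])
```

Because every value carries its lineage, any team reusing the feature starts from a known, governed provenance, which is what makes the store a "gold standard" rather than just a cache.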
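The experiment-tracking step boils down to bookkeeping: which run started from which data snapshot and used which algorithm and parameters. A minimal stdlib sketch of that bookkeeping (a toy stand-in, not the Vertex AI SDK; all names hypothetical):

```python
import hashlib
import json

class ExperimentTracker:
    """Toy stand-in for a managed experiment service (e.g., Vertex AI Experiments)."""

    def __init__(self):
        self.runs = []

    def log_run(self, dataset_version, algorithm, params, metric):
        # Deterministic run id derived from the run's full configuration.
        config = json.dumps([dataset_version, algorithm, params], sort_keys=True)
        run = {
            "run_id": hashlib.sha1(config.encode()).hexdigest()[:8],
            "dataset_version": dataset_version,
            "algorithm": algorithm,
            "params": params,
            "metric": metric,
        }
        self.runs.append(run)
        return run["run_id"]

    def best_run(self):
        return max(self.runs, key=lambda r: r["metric"])

tracker = ExperimentTracker()
tracker.log_run("features_v3", "xgboost", {"max_depth": 6}, metric=0.81)
tracker.log_run("features_v3", "logistic_regression", {"C": 1.0}, metric=0.74)
best = tracker.best_run()
print(best["algorithm"], best["dataset_version"])  # the winning run, with its lineage
```

The payoff is that the winning model is never separated from the data version and configuration that produced it, which is exactly the traceability the Experiment construct provides.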
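Finally, the monitor-and-re-trigger loop: compare live serving data against the training baseline and kick off retraining when drift crosses a threshold. A minimal sketch using a crude mean-shift check (production systems use richer statistics, such as distribution-distance measures):

```python
import statistics

def drift_score(baseline, live):
    """Crude drift signal: absolute shift in means, scaled by baseline stdev."""
    base_mean = statistics.mean(baseline)
    base_std = statistics.stdev(baseline) or 1.0  # guard against zero stdev
    return abs(statistics.mean(live) - base_mean) / base_std

def monitor(baseline, live, threshold=2.0, retrain=lambda: "retrain_triggered"):
    if drift_score(baseline, live) > threshold:
        return retrain()  # in practice: launch the training pipeline run
    return "ok"

baseline = [10, 11, 9, 10, 12, 10, 11]          # feature values seen at training time
live_stable = [10, 11, 10, 9, 11]               # serving traffic, no drift
live_shifted = [25, 27, 26, 24, 28]             # serving traffic, clear drift

print(monitor(baseline, live_stable))    # ok
print(monitor(baseline, live_shifted))   # retrain_triggered
```

The `retrain` callback is where governance fits: rather than retraining blindly, the trigger can open an approval step before the pipelines re-run.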
What innovations can these technologies unlock?
What new business models are possible?
What new possibilities open in terms of innovative business strategies?
What are your thoughts? I’ll comment on the above in my next blog entry.