About Rashik Parmar

Rashik is the leader of IBM’s European technical community, an IBM Distinguished Engineer, and a former President of IBM’s Academy of Technology. He currently advises clients on their cloud innovation roadmaps from both a technology and a business perspective, drawing on thirty-three years of practical experience at IBM. He has worked with financial, retail and manufacturing clients on IT transformation projects of all sizes, and specialises in ensuring the technical success of complex IT projects that transform business models. Some of his work is summarised in the HBR article he co-authored, “New Patterns for Innovation” (Jan-Feb 2014). Rashik was appointed to the Leeds City Region Local Enterprise Partnership Board and chairs its business communications group, helping develop innovation strategies to accelerate the growth of this £55bn marketplace. Rashik is also IBM’s Partnership Executive for Imperial College London, an Adjunct Professor in the Department of Innovation and Entrepreneurship at Imperial College Business School, and a Visiting Professor in the Intelligent Systems and Networks Group of the Department of Electrical & Electronic Engineering. Rashik was awarded an MBE (Member of the Order of the British Empire) in HM Queen Elizabeth II’s 90th birthday honours for his contribution to business and innovation.

Are digital platforms essential for growth?

The discussions at the Academy of Management annual meeting symposium, “A Multi-Disciplinary Perspective on Platform Ecosystems Research”, made a strong case for platforms as a primary lens for understanding business models. Industry Platforms and Ecosystem Innovation also makes a strong case for the central role of platforms in the evolution of a firm.

At the firm level, the five patterns described in “The New Patterns of Innovation” provide a simple way to identify innovation ideas and explore opportunities for growth. Considering these ideas as candidate platforms allows the full market potential of each idea to be explored. In addition, the challenges of creating the ecosystem of complementors, designing the digital platform and managing impacts on the current organisation can be assessed, and a financial case can be created.

Adding the structure of Component Business Modelling can provide an analytical framework to substantiate the financial case and develop the change programme.

These tools all seem to point to the digital platform as the primary value creation tool for established businesses, and there is a clear need to unpack this idea into a practical toolbox for managers.

Is there a business case for the next generation of digitisation?

As I reflect on the three emerging laws of digitisation, I have been asked many times if there is an overall “business case” for the shift in labour or GDP. The way I think about this is to start with some very simple high-level metrics.

Global GDP is approximately $75 trillion (World Bank).

The high-level industry breakdown, extrapolating from UK data:

Agriculture 2%
Manufacturing 23%
Services 75%

Digitisation in the form of pattern 1, the augmented product pattern, will allow the manufacturing industries to grow into the service industries and capture some of this GDP. If they are wildly successful, they could grow their share of GDP by 20% over a 20-year period, rising from 23% to roughly 28% and taking share from the service industries. This represents a roughly $3.5tr shift in GDP as a direct result of the augmented product pattern.

For the service industries, digitisation will lead to pattern 5: codify services. Consider the service industries as consisting of three broad groups: creative, routine and personal. Creative includes graphic designers, engineers and the like; personal includes hairdressers, local retailers and so on; routine includes work with a significant routine element that is highly likely to be digitised using combinations of cognitive technologies such as AI and machine learning. If we assume these groups are evenly distributed, i.e. each represents 25 percentage points of the 75% of GDP held by services, then the impact on the 25 points that are routine is the question we need to explore further to understand the business case.

If the manufacturing industries are successful with their augmented product pattern, we can see the routine service industry being reduced by 5 percentage points. The remaining 20 points will be subject to further digitisation. A working assumption could be that 50% of this will be displaced over a 20-50 year period, opening a new industry for services digitisation. If we assume that creating the platform for digitisation will be the key to success, and that adoption will be exponential, we could argue that 25% of that transformation will happen in the first 20 years, making a further reduction of 5 percentage points possible.

The conclusion is that the digitisation industries have the opportunity to represent 10% of global GDP over a 20-year period, which at the current GDP value is $7.5tr.
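The arithmetic above can be laid out as a simple back-of-the-envelope calculation. All shares and displacement rates are the working assumptions stated in the text, not measured data:

```python
# Back-of-the-envelope model of the digitisation business case.
# All percentages are working assumptions from the text, not forecasts.

GLOBAL_GDP_TR = 75.0  # approximate global GDP, $ trillions (World Bank)

# Pattern 1 (augmented products): manufacturing's 23% share grows by
# ~20% over 20 years, i.e. roughly 5 percentage points, taken from
# the routine service industry.
augmented_points = 0.23 * 0.20                          # ~0.046
augmented_shift_tr = augmented_points * GLOBAL_GDP_TR   # ~$3.5tr

# Pattern 5 (codify services): routine services start at 25 points of
# GDP, lose 5 points to augmented products, and 25% of the remaining
# 20 points is digitised in the first 20 years.
codified_points = (0.25 - 0.05) * 0.25                  # 0.05, i.e. 5 points

# Total digitisation opportunity: 5 + 5 = 10 points of global GDP.
total_points = 0.05 + codified_points
total_shift_tr = total_points * GLOBAL_GDP_TR           # $7.5tr

print(round(augmented_shift_tr, 2), round(total_shift_tr, 2))
```

Running the numbers confirms the rounding in the text: the augmented product shift comes out at $3.45tr (quoted as roughly $3.5tr), and the overall opportunity at $7.5tr.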


Use data and human intelligence in the right combination

I just listened to Sebastian Wernicke’s TED Talk, “How to use data to make a hit TV show”. The phrase of Sebastian’s that stayed with me was that data alone will not give you the answer you need about the future; you have to use the intelligence of the brain as well.

Often in discussions I have around data-savvy culture and the implications of this evolution, people become fearful for their own role, thinking that “data-driven robots will replace everything they do and they will no longer be valuable”. This is far from the reality. The human abilities to ask the right questions, sense the weak signals and understand the implications are unique skills that are seemingly impossible to replace at this stage.

What I took away from this video was how important it is for all of us to develop and hone our question-asking skills, and to start learning how best to use the wide range of data analysis tools at our disposal. I would recommend reading “A More Beautiful Question”, as it is the best book I have found for developing question-asking skills. In addition, stopping yourself from doing the same thing in the same way is another approach I have found valuable.

What are your thoughts on the balance between decisions driven by data and those driven by human intelligence and instinct?


Is there a set of Fundamental Data Types?

Numerals and number systems have served us well as aids to quantify and represent the information around us, but today they are just one small part of the vast and complex digital shadow that is cast over our daily lives. Statistical methods help us explain how this shadow has evolved and can amazingly help predict its near-term future, its adjacent possibles. But why does that materially matter to us? So what?
Every piece of data that has ever been captured or created had, has or will have a purpose, even if that purpose is not always immediately obvious. For instance, it might help guide or trigger an action that will add value in the real world. That value might improve someone’s life, advance a product or process, or just turn a traffic light from red to green. The data associated with these acts is either a catalyst for that value creation or a characteristic of its outcome.
The term ‘catalyst’ is chosen to draw an analogy with chemistry and to show how effective the idea of reacting data can be when considered within the context of a fundamental data framework, much as several branches of mathematics use the same principle to constrain how primitive number types work. A foundation of this framework, as with chemistry, has to be the elemental nature of data: its propensity to form in a fixed number of ways, or elemental types. These can also be thought of as exhibiting elemental properties. For instance, some data types must be highly volatile while others tend towards being inert; some must be heavy while others are lightweight, and so on. Thinking like this not only simplifies the way we perceive data, but also leads us to consider the ways in which these elemental data types might interact, or react, to make the world a better place.
As for the fundamental data types themselves, we know of a few already, but there are many more out there to be labelled properly. Certainly we can add data associated with location and time to the list; we commonly refer to these as geospatial and event data. But what about other types, such as personal, pricing and indexing data? Or, perhaps more excitingly, the more exotic types, like metadata and analytical, aggregated and inferred data? The answers are nowhere near clear yet, but playing with data in this way is leading us to understand that our journey is only just beginning. For all we think we know about data, the real lessons are still to come. But wouldn’t it be fascinating if we had something like a periodic table of data, a touchstone we could turn to to help us experiment with data in safe, but as yet unimagined, ways?
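As a purely illustrative sketch of the “periodic table of data” idea, one could model elemental data types as records carrying chemistry-like properties. The type names, property scales and values below are hypothetical, invented for illustration, not an established taxonomy:

```python
from dataclasses import dataclass

# Hypothetical "elemental" data types with chemistry-like properties.
# Names and property values are illustrative, not a standard taxonomy.

@dataclass(frozen=True)
class DataElement:
    name: str
    volatility: float  # 0.0 (tends towards inert) .. 1.0 (highly volatile)
    weight: float      # 0.0 (lightweight) .. 1.0 (heavy, costly to move)

PERIODIC_TABLE = [
    DataElement("geospatial", volatility=0.3, weight=0.6),
    DataElement("event",      volatility=0.9, weight=0.2),
    DataElement("personal",   volatility=0.4, weight=0.5),
    DataElement("metadata",   volatility=0.2, weight=0.1),
    DataElement("inferred",   volatility=0.7, weight=0.3),
]

def inert_elements(table, threshold=0.3):
    """Return the names of elements that tend towards being inert."""
    return [e.name for e in table if e.volatility <= threshold]

print(inert_elements(PERIODIC_TABLE))  # ['geospatial', 'metadata']
```

The point of such a sketch is not the particular numbers but the framing: once data types carry declared properties, one can reason about which combinations are safe, or cheap, to “react” together.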
Co-authored with Phil Tetlow: https://uk.linkedin.com/in/philtetlow @DocPhilT

Are we nearing the end of human ingenuity?

After reading David Gann and Mark Dodgson’s excellent article, “There will be much less work in the future. We need to rethink our societies”, I find myself pausing. Are we too entwined with what we know, using this data to define the future? There is no doubt that there will be less of the current work in the future. However, I never cease to be amazed by the human ingenuity that creates new meaningful employment.

Who would have guessed that the digital games industry would exist in the form it does today?

The scale of wastage and inefficiency in all the systems that support our daily lives will demand new solutions, and those solutions will create new employment in ways we cannot imagine today. So I think we need to explore the opportunities for new employment created by the digital economy and help guide the evolution of these systems to create new meaningful employment.

Soapbox mode off. Now time to think about how to create a better future.

Are there some fundamental laws for digitisation?

As we have learned through the development of other sciences, fundamental laws emerge which help us make sense of evolution. Many would argue that digitisation is still in its infancy, but there may be sufficient examples to start to identify some candidate fundamental laws. In the HBR paper “New Patterns for Innovation”, we identified the five atomic business model patterns that result from applying digitisation. They continue to serve well in developing the innovation strategy within a business. However, are there more fundamental laws at play here?

The first candidate law seems to be that data has gravity. Techopedia has a good summary, as does Dave McCrory’s blog. The amount of data you have is important: used well, it continually attracts new data and increases the gravitational effect. The risk, of course, is that having too little data will result in your data being subsumed by a larger pull of data. The effort required to ensure that your data remains distinct, relevant and current can then be determined.

The second candidate law is that what can be digitised, will be digitised. Digitisation increases our understanding of the thing being digitised; it guides improvement and optimisation, and in some cases replaces the physical object. Think of the bus ticket. Forty years ago you would see conductors on the bus taking a passenger’s money in return for a printed paper ticket. Digitisation led to ticket machines that produced machine-readable cards, then to contactless travel cards, and now to apps on phones.

This leads us nicely to the third and final candidate law: whatever is digitised tends to become free. The Wired article “Free! Why $0.00 Is the Future of Business” provides a compelling case. In his book “The Curve”, Nicholas Lovell explains how to make money when everything is going free. Daniel Pink, in “A Whole New Mind”, discusses the three A’s of Automation, Abundance and Asia that are driving the need for whole-mind thinking. The underlying narrative is how individuals need to find ways to remain valuable when digitisation, in the form of automation, is driving out costs.

As a Digital Leader, these three laws can help guide you in the development of your strategies and plans to:

  1. Use digitisation to disrupt established industries
  2. Identify the risks of digitisation and develop mitigating plans
  3. Transform your business to remain relevant and valuable.

What are the generic data platform business models?

Over the last three months, there has been a common set of questions about the design of the underlying business model for a data platform. If you have a data set that you think is valuable, how should you “sell” it?

  1. Data as a product.
    This is the simplest model: in essence, you sell all or part of the data for a defined value. This is a one-off payment, as for a product, and the purchaser has unrestricted use of the data. It is typically useful for static data, e.g. clinical trials, survey results, or insights from analytics.
  2. Subscription model
    The purchaser is able to access defined portions of the data through an API. Restrictions are often applied on data quality, frequency of access, usage and so on.
  3. Analysis or Model as a Service
    Here the data provider creates a model to which the purchaser can add their own data to create unique insights. As an example, retail sales data gathered for areas of a city can be provided as a model that allows real estate companies to assess the value of developing a shopping mall at a particular location.
  4. Insight as a Service
    Consultants or experts are made available by the data platform provider to deliver insights or answers to specific questions posed by the purchaser. This is essentially a consulting service in which the consultants are data scientists with access to unique data stores and analysis tools. Benchmarking services are a good example.
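The four models above can be sketched as a small taxonomy with toy billing rules. This is only an illustration of how the models differ in what is metered; the prices and billing logic below are invented placeholders, not real figures from any platform:

```python
from enum import Enum

# The four generic data platform business models described above.
# Prices and billing rules are invented placeholders for illustration.

class DataBusinessModel(Enum):
    DATA_AS_PRODUCT = "one-off sale of all or part of a data set"
    SUBSCRIPTION = "metered API access to defined portions of the data"
    MODEL_AS_A_SERVICE = "purchaser combines their own data with a model"
    INSIGHT_AS_A_SERVICE = "data scientists answer specific questions"

def price(model: DataBusinessModel, units: int = 1) -> float:
    """Toy billing rules: each model meters a different unit."""
    if model is DataBusinessModel.DATA_AS_PRODUCT:
        return 10_000.0            # single unrestricted purchase
    if model is DataBusinessModel.SUBSCRIPTION:
        return 0.05 * units        # per API call
    if model is DataBusinessModel.MODEL_AS_A_SERVICE:
        return 500.0 * units       # per analysis run
    return 1_500.0 * units         # per consulting day

print(price(DataBusinessModel.SUBSCRIPTION, units=2000))  # 100.0
```

The design point the sketch makes explicit is that each model charges for a different unit: the data set itself, an API call, an analysis run, or an expert’s time.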

These seem to be the four generic models I have come across so far. What other models have you come across?