In his book The Lean Startup, Eric Ries provides a scientific approach to getting a desired product into the customer's hands by focusing on value and eliminating waste. In part, this approach has accelerated post-industrial thinking about software code: it's soft and malleable. See the illustration below for Ries's method.
For instance, take a project like Smart Fleet Management and tackle one of the key components of EHM (equipment health management): real-time engine monitoring. It can be tested and broken into small building blocks, or phases, of data in motion and data ingested. By applying Ries's method, you can make changes throughout the development process to ensure that it 'works' before you buy the 'whole lot', without jeopardizing the success of the project. Highly simplified for this post, the Lean approach builds an MVP (minimum viable product) that people can actually use, then measures it in real time on the path to continuous deployment of an enterprise production application. The Lean method differs greatly from the traditional 'top-down' industrial approach that most companies are used to today.
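To make the "data in motion / data ingested" split concrete, here is a minimal sketch of an engine-monitoring MVP in Python. Everything here is an illustrative assumption, not a real EHM product: the class names, the engine ID, and the alarm thresholds are invented for the example.

```python
import random
from dataclasses import dataclass, field
from typing import List

@dataclass
class EngineReading:
    """One sensor sample from an engine (hypothetical schema)."""
    engine_id: str
    temp_c: float
    vibration_mm_s: float

@dataclass
class MonitorMVP:
    """Minimal engine-health monitor: stores readings as they arrive
    (the 'data ingested' phase) and flags anomalies on the live stream
    (the 'data in motion' phase)."""
    temp_limit_c: float = 105.0        # assumed alarm threshold
    vib_limit_mm_s: float = 7.1        # assumed alarm threshold
    ingested: List[EngineReading] = field(default_factory=list)
    alerts: List[str] = field(default_factory=list)

    def on_reading(self, r: EngineReading) -> None:
        self.ingested.append(r)        # data ingested (at rest)
        if r.temp_c > self.temp_limit_c:
            self.alerts.append(f"{r.engine_id}: over-temp {r.temp_c:.1f} C")
        if r.vibration_mm_s > self.vib_limit_mm_s:
            self.alerts.append(f"{r.engine_id}: high vibration {r.vibration_mm_s:.1f} mm/s")

def simulated_stream(n: int, seed: int = 42):
    """Stand-in for a real sensor feed: yields n random readings."""
    rng = random.Random(seed)
    for _ in range(n):
        yield EngineReading("ENG-1", rng.uniform(80, 110), rng.uniform(1, 9))

monitor = MonitorMVP()
for reading in simulated_stream(100):
    monitor.on_reading(reading)
print(f"ingested={len(monitor.ingested)} alerts={len(monitor.alerts)}")
```

The point of the sketch is the Lean loop: the monitoring rule is a small, testable building block that can be validated against simulated data before any investment in the full fleet-wide system.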
When building an MVP, open-source software components based on a modern data architecture make it possible to achieve a fast-evolving learning loop that focuses on value and eliminates waste. Data at Rest and Data in Motion are the two key pillars that make this process successful.
For the industrial sector, Data at Rest and Data in Motion will host the traditional technical infrastructure such as SCADA, DCS, PLC, vibration monitoring, data historians, equipment sensor data, fuel, ACARS, APU, etc. On the Data at Rest side, the open-source MapReduce programming model, invented at Google to process large-scale data sets at low cost, can leverage that installed industrial technical infrastructure as an enterprise application, and it is in production today in many industrial companies. An important step before you start an MVP project is to identify where you sit on the technical and capability maturity curve relative to your digital optimization goals. A FCOM (Frictional Cost Optimization Model) can be applied to review the operational technology (physics-based) data, your IT infrastructure, and your Industrial Data Science capabilities, so that no redundancies are overlooked for cost savings, barriers to success are identified, and a strategy is in place for a successful project.
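As a rough illustration of the MapReduce model mentioned above (the programming model itself, not a Hadoop deployment), the sketch below averages historian tag readings in distinct map, shuffle, and reduce phases. The tag names and values are invented for the example.

```python
from collections import defaultdict

# Toy "data historian" records: (tag, value) pairs, as they might be
# exported from an industrial historian. Values are made up.
records = [
    ("turbine_1/temp", 612.0),
    ("turbine_1/temp", 618.5),
    ("turbine_2/temp", 601.2),
    ("turbine_2/temp", 598.8),
]

def map_phase(record):
    """Map: emit (key, partial aggregate) pairs for one input record."""
    tag, value = record
    yield (tag, (value, 1))  # partial sum and count

def shuffle(mapped):
    """Shuffle: group all partial aggregates by key."""
    grouped = defaultdict(list)
    for key, val in mapped:
        grouped[key].append(val)
    return grouped

def reduce_phase(key, values):
    """Reduce: combine partials for one key into a final average."""
    total = sum(v for v, _ in values)
    count = sum(c for _, c in values)
    return key, total / count

mapped = [kv for rec in records for kv in map_phase(rec)]
averages = dict(reduce_phase(k, vs) for k, vs in shuffle(mapped).items())
print(averages)  # per-tag averages, e.g. turbine_1/temp -> 615.25
```

In a real cluster the map and reduce functions stay this simple; the framework handles distributing them across machines, which is where the cost savings on large-scale data sets come from.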
Ries's approach, along with the powerful combination of open-source components and the installed base of an industrial infrastructure, can have a sizable impact on lowering traditional commercial software license costs. The cost of high-volume data growth, and the scalability often sacrificed to contain it, are barriers to entry when a project like Smart Fleet Management is being architected for Oil & Gas production optimization, uptime for power generation units, or aircraft fleet optimization.