Key Takeaways

- There are many decisions and tradeoffs that must be made when moving from batch ETL to stream data processing.
- Engineers should not "stream all the things" just because stream-processing technology is popular.
- The Netflix case study presented here migrated to Apache Flink.
Arora, a senior data engineer at Netflix, began by stating that the key goal of the presentation was to help the audience decide whether a stream-processing data pipeline would help resolve problems they may be experiencing with a traditional extract-transform-load (ETL) batch-processing job.
In addition, she discussed the core decisions and tradeoffs that must be made when moving from batch to streaming. The Netflix system uses the microservice architectural style, and services communicate via remote procedure calls (RPC) and messaging. At a high level, microservice application instances emit user- and system-driven data events that are collected within the Netflix Keystone data pipeline — a petabyte-scale real-time event stream-processing system for business and product analytics.
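To make the idea of "data events" concrete, here is a minimal sketch of what one such event record might look like. The field names are purely illustrative assumptions, not Keystone's actual schema:

```python
import json
import time
import uuid

# Hypothetical playback event; field names are illustrative only,
# not the real Netflix/Keystone schema.
def make_play_event(user_id, title_id, device):
    return {
        "event_id": str(uuid.uuid4()),             # unique id, useful for deduplication downstream
        "event_type": "playback_started",
        "user_id": user_id,
        "title_id": title_id,
        "device": device,
        "event_time_ms": int(time.time() * 1000),  # when the user actually acted (event time)
    }

event = make_play_event("user-42", "title-7", "tv")
print(json.dumps(event))
```

Events like this would be serialized and published to the pipeline, where stream or batch jobs consume them.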
Batch-processed data is stored within tables or indexers like Elasticsearch for consumption by the research team, downstream systems, or dashboard applications. There are clear business wins for using stream processing, including the opportunity to train machine-learning algorithms with the latest data, provide innovation in the marketing of new launches, and create opportunities for new kinds of machine-learning algorithms.
There are also technical wins, such as the ability to save on storage costs (raw data does not need to be stored in its original form), faster turnaround time on error correction (long-running batch jobs can incur significant delays when they fail), real-time auditing on key personalization metrics, and integration with other real-time systems.
A core challenge when implementing stream processing is picking an appropriate engine.
The first key question to ask is whether the data will be processed as an event-based stream or in micro-batches. If results are simply required sooner than currently provided, and the organization has already invested heavily in batch, then migrating to micro-batching could be the most appropriate and cost-effective solution.
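The distinction between the two models can be sketched in a few lines of plain Python (this is a conceptual illustration, not how an engine like Flink or Spark implements it): event-based processing handles each record as it arrives, while micro-batching buffers records and processes them in small groups, trading a little latency for batch-style semantics.

```python
# Per-event processing: each record is handed to the handler as it arrives.
def process_stream(events, handler):
    for e in events:
        handler(e)

# Micro-batch processing: records are buffered and flushed in small groups,
# trading some latency for simpler, batch-style handling.
def process_microbatches(events, handler, batch_size=3):
    batch = []
    for e in events:
        batch.append(e)
        if len(batch) == batch_size:
            handler(batch)
            batch = []
    if batch:  # flush the final partial batch
        handler(batch)

seen = []
process_microbatches(range(7), seen.append, batch_size=3)
print(seen)  # [[0, 1, 2], [3, 4, 5], [6]]
```

In a real engine the handler would update keyed state or windows, and micro-batch boundaries would be driven by time intervals rather than a fixed count.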
The next challenge in picking a stream-processing engine is to ask which features will be most important for solving the problem being tackled. This will most likely not be settled in an initial brainstorming session — often a deep understanding of the problem and data only emerges after an in-depth investigation.
Each engine supports a given feature to varying degrees and via different mechanisms. Another question to ask is whether the implementation requires the lambda architecture. This architecture is not to be confused with AWS Lambda or serverless technology in general — in the data-processing domain, the lambda architecture is designed to handle massive quantities of data by taking advantage of both batch-processing and stream-processing methods.
It may be that an existing batch job simply needs to be augmented with a speed layer; if so, choosing a data-processing engine that supports both layers of the lambda architecture may facilitate code reuse.
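The batch/speed split described above can be sketched with a toy serving layer in plain Python. The assumption here (purely for illustration; keys and numbers are made up) is that both layers produce per-key counts: the batch layer holds accurate-but-stale views recomputed periodically, and the speed layer holds the real-time deltas accumulated since the last batch run.

```python
# Batch view: accurate but stale, recomputed by a periodic batch job.
batch_view = {"title-1": 1000, "title-2": 500}

# Speed view: real-time deltas streamed in since the last batch run.
speed_view = {"title-1": 12, "title-3": 4}

def query(key):
    # The serving layer merges both views at read time.
    return batch_view.get(key, 0) + speed_view.get(key, 0)

print(query("title-1"))  # 1012 (batch result plus real-time delta)
print(query("title-3"))  # 4 (so far seen only by the speed layer)
```

When the next batch run completes, its output replaces the batch view and the corresponding speed-layer deltas are discarded, which is why an engine that can express both layers in shared code is attractive.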
Several additional questions to ask when choosing a stream-processing engine include:

- What are other teams using within your organization? If there is a significant investment in a specific technology, then existing implementation and operational knowledge can often be leveraged.
- What is the landscape of the existing ETL systems within your organization? Will a new technology fit easily with existing sources and sinks?
- How steep a learning curve can the team accept? Which engines do you use for batch processing, and what are the most widely adopted programming languages?