Feb 14, 19: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

Cover image: Correlation-Causation (Source)

[ LOCAL EVENTS & SESSIONS]

More WEB events? Click Here

[ AnalyticsWeek BYTES]

>> The Future of Big Data [Infographics] by v1shal

>> Specificity is the Soul of Data Narrative by analyticsweek

>> @ChuckRehberg / @TrigentSoftware on Translating Technology to Solve Business Problems #FutureOfData by v1shal

Wanna write? Click Here

[ NEWS BYTES]

>> JMMMMC Recognized For Overall Excellence In Quality Among Rural Hospitals – Sand Hills Express (Under Health Analytics)

>> Webinar: Cloud security – Five questions to help decide who to trust – The Lawyer (Under Cloud Security)

>> Global Streaming Analytics Market 2018 Share, Trend, Segmentation and Forecast to 2023 – Finance Exchange (Under Streaming Analytics)

More NEWS ? Click Here

[ FEATURED COURSE]

Statistical Thinking and Data Analysis


This course is an introduction to statistical data analysis. Topics are chosen from applied probability, sampling, estimation, hypothesis testing, linear regression, analysis of variance, categorical data analysis, and n… more

[ FEATURED READ]

Antifragile: Things That Gain from Disorder


Antifragile is a standalone book in Nassim Nicholas Taleb’s landmark Incerto series, an investigation of opacity, luck, uncertainty, probability, human error, risk, and decision-making in a world we don’t understand. The… more

[ TIPS & TRICKS OF THE WEEK]

Winter is coming, warm your Analytics Club
Yes and yes! As we head into winter, what better time to talk about our increasing dependence on data analytics to help with decision making. Data- and analytics-driven decision making is rapidly working its way into our core corporate DNA, yet we are not building practice grounds to test those models fast enough. Snug-looking models can hide nails that cause uncharted pain if they go unchecked. This is the right time to start thinking about putting an Analytics Club [Data Analytics CoE] in your workplace to lab out best practices and provide a test environment for those models.

[ DATA SCIENCE Q&A]

Q:Do you know / used data reduction techniques other than PCA? What do you think of step-wise regression? What kind of step-wise techniques are you familiar with?
A: Data reduction techniques other than PCA:
Partial least squares: like PCR (principal component regression) but chooses the principal components in a supervised way. Gives higher weights to variables that are most strongly related to the response

step-wise regression?
– the choice of predictive variables is carried out using a systematic procedure
– Usually, it takes the form of a sequence of F-tests, t-tests, adjusted R-squared, AIC, BIC
– at any given step, the model is fit using unconstrained least squares
– can get stuck in local optima
– Better: Lasso

step-wise techniques:
– Forward-selection: begin with no variables, adding them when they improve a chosen model comparison criterion
– Backward-selection: begin with all the variables, removing them when it improves a chosen model comparison criterion

Better than reduced data:
Example 1: If all the components have a high variance: which components to discard with a guarantee that there will be no significant loss of the information?
Example 2 (classification):
– One has 2 classes; the within class variance is very high as compared to between class variance
– PCA might discard the very information that separates the two classes

Better than a sample:
– When number of variables is high relative to the number of observations

Source
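As a rough companion to the answer above, here is a minimal sketch comparing the techniques it names: PCA/PCR (unsupervised components), partial least squares (supervised components), forward stepwise selection, and the lasso. The synthetic dataset, component counts, and scikit-learn usage are illustrative assumptions, not part of the original answer.

```python
# Illustrative sketch only: compare PCR, PLS, forward stepwise selection,
# and the lasso on synthetic regression data (all settings are assumptions).
from sklearn.datasets import make_regression
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LassoCV, LinearRegression
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=200, n_features=30, n_informative=5,
                       noise=10.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# PCR: principal components are chosen by variance alone, ignoring y.
pca = PCA(n_components=5).fit(X_tr)
pcr = LinearRegression().fit(pca.transform(X_tr), y_tr)
print("PCR      R^2:", pcr.score(pca.transform(X_te), y_te))

# PLS: components are weighted toward directions most related to y.
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
print("PLS      R^2:", pls.score(X_te, y_te))

# Forward stepwise: add variables while they improve a cross-validated score.
sfs = SequentialFeatureSelector(LinearRegression(), n_features_to_select=5,
                                direction="forward").fit(X_tr, y_tr)
step = LinearRegression().fit(sfs.transform(X_tr), y_tr)
print("Stepwise R^2:", step.score(sfs.transform(X_te), y_te))

# Lasso: an L1 penalty shrinks coefficients and zeroes some out entirely.
lasso = LassoCV(cv=5).fit(X_tr, y_tr)
print("Lasso    R^2:", lasso.score(X_te, y_te))
```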

[ VIDEO OF THE WEEK]

@AnalyticsWeek: Big Data Health Informatics for the 21st Century: Gil Alterovitz

Subscribe to YouTube

[ QUOTE OF THE WEEK]

Data is the new science. Big Data holds the answers. – Pat Gelsinger

[ PODCAST OF THE WEEK]

Andrea Gallego (@risenthink) / @BCG on Managing Analytics Practice #FutureOfData #Podcast

Subscribe: iTunes | GooglePlay

[ FACT OF THE WEEK]

571 new websites are created every minute of the day.

Sourced from: Analytics.CLUB #WEB Newsletter

Six Practices Critical to Creating Value from Data and Analytics [INFOGRAPHIC]

IBM Institute for Business Value surveyed 900 IT and business executives from 70 countries from June through August 2013. The 50+ survey questions were designed to help translate concepts relating to generating value from analytics into actions. They found that business leaders adopt specific strategies to create value from data and analytics. Leaders:

  1. are 166% more likely to make decisions based on data.
  2. are 2.2 times more likely to have a formal career path for analytics.
  3. cite growth as the key source of value.
  4. measure the impact of analytics investments.
  5. have predictive analytics capabilities.
  6. have some form of shared analytics resources.

Read my summary of IBM’s study here. Download the entire study here. And check out IBM’s infographic below.

IBM Institute for Business Value - 2013 Infographic

 

Source by bobehayes

6 things that you should know about VMware vSphere 6.5

vSphere 6.5 offers a resilient, highly available, on-demand infrastructure that is the perfect groundwork for any cloud environment. It provides innovation that will assist the digital transformation of the business and make the IT administrator's job simpler, freeing up time for innovation rather than maintaining the status quo. Furthermore, vSphere is the foundation of VMware's hybrid cloud strategy and is necessary for cross-cloud architectures. Here are essential features of the new and updated vSphere.

vCenter Server appliance

vCenter is an essential back-end tool that controls VMware's virtual infrastructure. vCenter 6.5 has many upgraded features, including a migration tool that helps in moving from vSphere 5.5 or 6.0 to vSphere 6.5. The vCenter Server appliance also includes VMware Update Manager, which eliminates the need for restarting external VM tasks or using pesky plugins.

vSphere client

In the past, the front-end client used for accessing the vCenter Server was old-fashioned and clunky. The vSphere client has undergone a much-needed move to HTML5. Aside from the expected performance gains, the change also makes the tool cross-browser compatible and more mobile-friendly. Plugins are no longer needed, and the UI has been switched to a more modern aesthetic based on the VMware Clarity UI.

Backup and restore

The backup and restore capabilities of vSphere 6.5 are an excellent addition that enables clients to back up data on Platform Services Controller appliances or the vCenter Server directly from the Application Programming Interface (API) or the Virtual Appliance Management Interface (VAMI). It can also back up both VUM and Auto Deploy embedded within the appliance. The backup consists mainly of files that are streamed to a preferred storage device over the SCP, FTP(S), or HTTP(S) protocols.

Superior automation capabilities

With regard to automation, VMware vSphere 6.5 shines because of the new upgrades. The reworked PowerCLI is an excellent addition because it is completely module-based and its APIs are in high demand. This enables IT administrators to fully automate tasks down to the virtual machine level.

Secure boot

The secure boot element of vSphere 6.5 extends to virtual machines. The feature is available for both Linux and Windows VMs, and secure boot is enabled by ticking a simple checkbox in the VM properties. Once it is enabled, only properly signed VMs can boot in the virtual environment.

Improved auditing

vSphere 6.5 offers clients improved audit-quality logging. This helps in accessing more forensic detail about user actions, making it easier to determine what was done, when, and by whom, and whether any anomalies or security threats warrant investigation.

VMware's vSphere grew out of the complexity and necessity of an expanding virtualization market. The earlier server products were not robust enough to deal with the increasing demands of IT departments. As businesses invested in virtualization, they had to consolidate and simplify their physical server farms into virtualized ones, and this triggered the need for virtual infrastructure. With these vSphere 6.5 features in mind, you can unleash its full potential. Make the switch today to the new and innovative VMware vSphere 6.5.

 

Source: 6 things that you should know about VMware vSphere 6.5 by thomassujain

Feb 07, 19: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

Cover image: Insights (Source)

[ LOCAL EVENTS & SESSIONS]

More WEB events? Click Here

[ AnalyticsWeek BYTES]

>> Caterpillar digs in to data analytics—investing in hot startup Uptake by analyticsweekpick

>> Healthcare Dashboards: Examples of Visualizing Key Metrics & KPIs by analyticsweek

>> 8 ways IBM Watson Analytics is transforming business by analyticsweekpick

Wanna write? Click Here

[ NEWS BYTES]

>> Five Data Science Job Opportunities At Microsoft India That You Should Not Miss Out On – Analytics India Magazine (Under Data Science)

>> Big Data Analytics in Banking Market Competition from Opponents, Constraints and Threat Growth Rate … – thebankingsector.com (Under Big Data Analytics)

>> Survey: What’s Ahead for Martech in 2019? – Franchising.com (Under Marketing Analytics)

More NEWS ? Click Here

[ FEATURED COURSE]

CS229 – Machine Learning


This course provides a broad introduction to machine learning and statistical pattern recognition. … more

[ FEATURED READ]

On Intelligence


Jeff Hawkins, the man who created the PalmPilot, Treo smart phone, and other handheld devices, has reshaped our relationship to computers. Now he stands ready to revolutionize both neuroscience and computing in one strok… more

[ TIPS & TRICKS OF THE WEEK]

Fix the Culture, spread awareness to get awareness
Adoption of analytics tools and capabilities has not yet caught up to industry standards. Talent has always been the bottleneck to achieving comparable enterprise adoption, and one of the primary reasons is a lack of understanding and knowledge among stakeholders. To facilitate wider adoption, data analytics leaders, users, and community members need to step up and create awareness within the organization. An aware organization goes a long way in securing quick buy-ins and better funding, which ultimately leads to faster adoption. So be the voice that you want to hear from leadership.

[ DATA SCIENCE Q&A]

Q:Do you think 50 small decision trees are better than a large one? Why?
A: * Yes!
* More robust model (ensemble of weak learners that come and make a strong learner)
* Better to improve a model by taking many small steps than fewer large steps
* If one tree makes an error, it can be corrected by the trees that follow
* Less prone to overfitting

Source
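To make the point above concrete, here is a minimal sketch (not from the original answer) comparing one large, fully grown tree with an ensemble of 50 shallow trees; the synthetic dataset and depth settings are arbitrary assumptions.

```python
# Illustrative sketch: one deep tree vs. an ensemble of 50 shallow trees.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, n_informative=8,
                           random_state=0)

one_large_tree = DecisionTreeClassifier(max_depth=None, random_state=0)
fifty_small_trees = RandomForestClassifier(n_estimators=50, max_depth=4,
                                           random_state=0)

# Cross-validated accuracy: the ensemble of weak learners is typically more
# robust and less prone to overfitting than the single large tree.
print("one large tree:", cross_val_score(one_large_tree, X, y, cv=5).mean())
print("50 small trees:", cross_val_score(fifty_small_trees, X, y, cv=5).mean())
```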

[ VIDEO OF THE WEEK]

@chrisbishop on futurist's lens on #JobsOfFuture #FutureofWork #JobsOfFuture #Podcast

Subscribe to YouTube

[ QUOTE OF THE WEEK]

The world is one big data problem. – Andrew McAfee

[ PODCAST OF THE WEEK]

@EdwardBoudrot / @Optum on #DesignThinking & #DataDriven Products #FutureOfData #Podcast

Subscribe: iTunes | GooglePlay

[ FACT OF THE WEEK]

The influx of data is putting a strain on IT infrastructure, with 55 percent of respondents reporting a slowdown of IT systems and 47 percent citing data security problems, according to a global survey of executives by Avanade.

Sourced from: Analytics.CLUB #WEB Newsletter

The pros of using facial recognition technology

Facial recognition technology (FRT) is now widely used in workplaces and by company owners to reduce the occurrence of crime. The workplace is not the only area that has seen the inclusion of facial recognition; with time it has become a practical application that is part of everyday life. The mobile phone a person uses can be enabled with a facial lock that allows the user to maintain privacy over his or her device.

The concept of facial recognition technology

The buzz about facial recognition and its implementation in various gadgets and spheres of human life can be well understood once the core concept of this technology is known. Facial recognition is software, used in devices operated by artificial intelligence, that makes use of biometric data to build a mathematical mapping of the features of a human face. This is done so that it becomes easier to identify a person from his or her digital image. The software uses specific algorithms to compare the face of a person with the stored digital image, which lends authenticity to the identification process. Other identifying technologies used alongside facial recognition include fingerprint matching and voice recognition.
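The comparison step described above can be pictured as measuring how close a freshly captured face template is to the stored one. The sketch below is purely hypothetical: the embedding vectors and the 0.8 threshold stand in for the output of a real biometric model.

```python
# Hypothetical sketch of the matching step: compare a probe face embedding
# against enrolled templates using cosine similarity. The vectors and the
# threshold are made-up placeholders, not output from a real face model.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Enrolled templates: identity -> embedding captured at registration time.
enrolled = {
    "alice": np.array([0.12, 0.80, 0.35, 0.44]),
    "bob":   np.array([0.90, 0.10, 0.25, 0.33]),
}

def identify(probe: np.ndarray, threshold: float = 0.8) -> str:
    # Pick the enrolled identity with the highest similarity; reject the
    # match if the best score falls below the decision threshold.
    name, score = max(((n, cosine_similarity(probe, e)) for n, e in enrolled.items()),
                      key=lambda pair: pair[1])
    return name if score >= threshold else "unknown"

print(identify(np.array([0.10, 0.78, 0.40, 0.45])))  # likely "alice"
```
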
The advantages of using facial recognition technology

When FRT is applied, the identity of a person is confirmed in three steps: first the person's fingerprints are scanned, second the face detection tools are employed, and third the verification is completed through comparison. Facial recognition can also be used in crowded places and will help in picking out individuals. There are multiple advantages of using this technology, and some of these are discussed below:

• Automation of the process of identification and identity verification:

In any workplace or business center, numerous individuals are working in different areas at the same time. Staffing each and every entry and exit checkpoint would require multiple security guards, and the manual process of checking and identification takes a considerable amount of time. However, if FRT is used, the entire process becomes automatic, saving operational cost and time. The process of allowing people to enter the premises of the business organization becomes simplified with FRT installed, and extra surveillance staff for monitoring cameras is not required once the identification process is automated.

• Provision of acquiring enhanced security:

Security is a huge factor in almost every place, and in organizations a breach in security can result in serious damage, so investing in applications that make security foolproof is sensible. A biometric system helps a business owner track anyone who enters the organization. The system is designed in such a manner that if a person attempts to trespass, an alert is raised immediately, along with a picture of the person who has entered the premises without permission, which helps in finding that person quickly. For help optimizing a business organization, one can visit redtailseo.com/las-vegas/.
The presence of security staff might not be enough to spot such trespassers, because people who enter a premises illegally usually have bad intentions and will therefore know ways to avoid the security guards. However, it is not that easy to evade a facial recognition security system, so trespassers will think twice before entering a protected building.

• Superior level of accuracy:

The technology of facial recognition has undergone many changes since its inception and is now equipped with infrared cameras and 3D capability. This has undoubtedly increased the accuracy of the identification process. It is possible to hoodwink a facial recognition security system, but it is tough to do so. The level of accuracy also helps in cataloging all the individuals who have visited the facility, and in case of any unpleasant situation the recording made by the software throughout the day can be reviewed.

FRT system – The potential areas that can be worked upon

There is no doubt that facial recognition has changed the way of looking at security, but it also has some problem areas which can be worked upon. These problems are discussed below:

• The angle of the surveillance:

The angle at which the surveillance camera is placed often creates a lot of pressure when trying to ensure flawless identification. Numerous angles have to be used together when uploading a face into the software's storage system. A frontal view is necessary for generating a face template that is clear and recognizable. The photos need to have high resolution, and more than one angle has to be used when capturing the picture. If an intruder wears sunglasses, it might be difficult for the software to pick up that individual.

• Processing of information and storage of data:

On a digital platform, storage is indispensable. The large amount of information that is recorded has to be stored correctly so that it can be reviewed later. In a facial recognition system, each frame has to be processed, so a group of computers is used to minimize the time taken to process the information.

• The quality of the image stored:

Low-quality images will produce underwhelming recognition results, so an advanced software system has to be used to operate the digital cameras responsible for image capture. In the present system the image is captured from a particular video and then compared with the stored photo, but such comparisons can be flawed because small, low-resolution images do not give reliable results.

Therefore, while there are areas of facial recognition technology that can still be improved, the introduction of this software has unquestionably revolutionized the security system.

Originally Posted at: The pros of using facial recognition technology

WPP to grow its data with Dunnhumby bid

Advertising company WPP is expected to buy a percentage of the data analytics firm.

Tesco’s data and analytics firm, Dunnhumby, has roused interest from advertising firm WPP, which is reportedly set to make a bid for the data firm.

Plans to sell the firm behind Tesco‘s clubcard had been announced in January, with WPP now confirming that it will be bidding for the company.


Sir Martin Sorrell confirmed the intention at the Deloitte Media and Telecoms conference. Sorrell said that it is important for manufacturers to have the best data on what their customers are doing.

WPP’s history suggests that it will not buy the company outright, as a number of deals for percentage stakes in companies highlight the ad firm’s tactic of not owning firms outright.

Sorrell said at WPP’s fourth quarter earnings call: “We are not big plonkers. We don’t plonk down large amounts of cash to 100% acquire particularly public companies with limitations on how earn-outs can be structured and of course, on people.”

“If you look at these technology companies, a very significant amount of the costs each year are share substitutes. Share-based compensation plays a very big important part, and paying 100% upfront for a company does leave you particularly vulnerable in the big tech area.”

Dunnhumby has been valued at £2bn.

To read the original article on Computer Business Review, click here.

Source: WPP to grow its data with Dunnhumby bid

Jan 31, 19: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

Cover image: Fake data (Source)

[ LOCAL EVENTS & SESSIONS]

More WEB events? Click Here

[ AnalyticsWeek BYTES]

>> A case for Computer Algorithms and Recursion in Data Science Education by vinny

>> Simplifying Data Warehouse Optimization by analyticsweekpick

>> What’s a CFO’s biggest fear, and how can machine learning help? by wwcheng

Wanna write? Click Here

[ NEWS BYTES]

>> Using Big Data to Give Patients Control of Their Own Health – Singularity Hub (Under Big Data)

>> Big Data Analytics in Banking Market 2025: Global Demand, Key Players, Overview, Supply and Consumption Analysis – Honest Facts (Under Big Data Analytics)

>> Manual intervention is hindering the customer experience – Chain Store Age (Under Customer Experience)

More NEWS ? Click Here

[ FEATURED COURSE]

CS229 – Machine Learning


This course provides a broad introduction to machine learning and statistical pattern recognition. … more

[ FEATURED READ]

Machine Learning With Random Forests And Decision Trees: A Visual Guide For Beginners


If you are looking for a book to help you understand how the machine learning algorithms “Random Forest” and “Decision Trees” work behind the scenes, then this is a good book for you. Those two algorithms are commonly u… more

[ TIPS & TRICKS OF THE WEEK]

Keeping Biases Checked during the last mile of decision making
Today a data-driven leader, data scientist, or data expert is constantly put to the test by helping their team solve problems using their skills and expertise. Believe it or not, part of that decision tree is derived from intuition, which adds a bias to our judgment and taints the suggestions. Most skilled professionals understand and handle these biases well, but in a few cases we fall into tiny traps and find ourselves caught in biases that impair judgment. So it is important to keep intuition bias in check when working on a data problem.

[ DATA SCIENCE Q&A]

Q:How to efficiently scrape web data, or collect tons of tweets?
A: * Python example
* Requesting and fetching the webpage into the code: httplib2 module
* Parsing the content and getting the necessary info: BeautifulSoup from bs4 package
* Twitter API: the Python wrapper for performing API requests. It handles all the OAuth and API queries in a single Python interface
* MongoDB as the database
* PyMongo: the Python wrapper for interacting with the MongoDB database
* Cron jobs: a time-based scheduler used to run scripts at specific intervals; spacing out requests this way helps avoid "rate limit exceeded" errors

Source
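Here is a minimal sketch of the fetch-and-parse steps listed above, using httplib2 and BeautifulSoup as the answer suggests. The URL is a placeholder, and the Twitter API, MongoDB/PyMongo storage, and cron scheduling steps are left out for brevity.

```python
# Minimal fetch-and-parse sketch (placeholder URL; storage and scheduling omitted).
import httplib2
from bs4 import BeautifulSoup

URL = "https://example.com/articles"  # placeholder target page

# 1. Request and fetch the web page (httplib2 module).
http = httplib2.Http()
response, content = http.request(URL)

# 2. Parse the content and pull out the necessary info (BeautifulSoup, bs4).
soup = BeautifulSoup(content, "html.parser")
for link in soup.find_all("a"):
    title = link.get_text(strip=True)
    href = link.get("href")
    if title and href:
        print(title, "->", href)
```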

[ VIDEO OF THE WEEK]

#HumansOfSTEAM feat. Hussain Gadwal, Mechanical Designer via @SciThinkers #STEM #STEAM

Subscribe to YouTube

[ QUOTE OF THE WEEK]

Everybody gets so much information all day long that they lose their common sense. – Gertrude Stein

[ PODCAST OF THE WEEK]

#BigData @AnalyticsWeek #FutureOfData #Podcast with @DavidRose, @DittoLabs

Subscribe: iTunes | GooglePlay

[ FACT OF THE WEEK]

The Hadoop (open source software for distributed computing) market is forecast to grow at a compound annual growth rate of 58 percent, surpassing $1 billion by 2020.

Sourced from: Analytics.CLUB #WEB Newsletter

True Test of Loyalty – Article in Quality Progress

Read the study by Bob E. Hayes, Ph.D. in the June 2008 edition of Quality Progress magazine titled The True Test of Loyalty. This Quality Progress article discusses the measurement of customer loyalty. Despite its importance in increasing profitability, customer loyalty measurement hasn’t kept pace with its technology. Using advocacy, purchasing and retention indexes to manage loyalty is statistically superior to using any single question alone. These indexes helped predict the growth potential of wireless service providers and PC manufacturers. You can download the article here.

Source: True Test of Loyalty – Article in Quality Progress by bobehayes

Jan 24, 19: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

Cover image: Statistics (Source)

[ LOCAL EVENTS & SESSIONS]

More WEB events? Click Here

[ AnalyticsWeek BYTES]

>> 2016 Trends in Big Data: Insights and Action Turn Big Data Small by jelaniharper

>> The Point of Advanced Machine Learning: Understanding Cognitive Analytics by jelaniharper

>> Data center location – your DATA harbour by martin

Wanna write? Click Here

[ NEWS BYTES]

>> Video Data Security. The view from the experts. – Security Today (press release) (blog) (Under Data Security)

>> 3 virtualization infrastructure design rules to shape your deployment – TechTarget (Under Virtualization)

>> Don’t Miss the Data Train – MarTech Advisor (Under Social Analytics)

More NEWS ? Click Here

[ FEATURED COURSE]

Hadoop Starter Kit


Hadoop learning made easy and fun. Learn HDFS, MapReduce and introduction to Pig and Hive with FREE cluster access…. more

[ FEATURED READ]

Data Science for Business: What You Need to Know about Data Mining and Data-Analytic Thinking


Written by renowned data science experts Foster Provost and Tom Fawcett, Data Science for Business introduces the fundamental principles of data science, and walks you through the “data-analytic thinking” necessary for e… more

[ TIPS & TRICKS OF THE WEEK]

Finding success in your data science? Find a mentor
Yes, most of us don't feel the need, but most of us really could use one. Because most data science professionals work in isolation, getting an unbiased perspective is not easy. Many times it is also not easy to see how a data science career will progress. A network of mentors addresses these issues easily: it gives data professionals an outside perspective and an unbiased ally. It is extremely important for successful data science professionals to build a mentor network and use it throughout their careers.

[ DATA SCIENCE Q&A]

Q:What is your definition of big data?
A: Big data is high volume, high velocity and/or high variety information assets that require new forms of processing
– Volume: big data doesn’t sample, just observes and tracks what happens
– Velocity: big data is often available in real-time
– Variety: big data comes from texts, images, audio, video…

Difference between big data and business intelligence:
– Business intelligence uses descriptive statistics on data with high information density to measure things, detect trends, etc.
– Big data uses inductive statistics (statistical inference) and concepts from non-linear system identification to infer laws (regression, classification, clustering) from large data sets with low information density, in order to reveal relationships and dependencies or to predict outcomes and behaviors

Source

[ VIDEO OF THE WEEK]

@AnalyticsWeek: Big Data at Work: Paul Sonderegger

Subscribe to YouTube

[ QUOTE OF THE WEEK]

We chose it because we deal with huge amounts of data. Besides, it sounds really cool. – Larry Page

[ PODCAST OF THE WEEK]

Unconference Panel Discussion: #Workforce #Analytics Leadership Panel

Subscribe: iTunes | GooglePlay

[ FACT OF THE WEEK]

Three-quarters of decision-makers (76 per cent) surveyed anticipate significant impacts in the domain of storage systems as a result of the “Big Data” phenomenon.

Sourced from: Analytics.CLUB #WEB Newsletter

Making Big Data Work: Supply Chain Management

In recent decades, companies have looked to technology, lean manufacturing, and global production to increase efficiency and reduce costs. But these tactics are leading to diminishing returns.

Many companies have moved production offshore, for instance. However, the attractiveness of that opportunity is diminishing as differences in global manufacturing costs between countries such as China and the U.S. have narrowed over the past ten years. (See How Global Manufacturing Cost Competitiveness Has Shifted over the Past Decade, BCG Data Point, May 2014.) At the same time, supply chains have grown more complicated—many spanning multiple continents and involving external suppliers—while customer demands have gotten more complex. As a result, companies are bringing production closer to home markets (“nearshoring”) and sometimes “reshoring” production all the way back home to high-labor-rate countries. (See The U.S. as One of the Developed World’s Lowest-Cost Manufacturers: Behind the American Export Surge, BCG Focus, August 2013.)

The combination of large, fast-moving, and varied streams of big data and advanced tools and techniques such as geoanalytics represents the next frontier of supply chain innovation. When they are guided by a clear understanding of the strategic priorities, market context, and competitive needs of a company, these approaches offer major new opportunities to enhance customer responsiveness, reduce inventory, lower costs, and improve agility.

Companies can optimize distribution, logistics, and production networks by using powerful data-processing and analysis capabilities. They can also improve the accuracy of their demand forecasts, discover new demand patterns, and develop new services by sharing data with partners across the supply chain. In addition, they can increase asset uptime and expand throughput, engage in preventive maintenance of production assets and installed products, and conduct near real-time supply planning using dynamic data feeds from production sensors and the Internet of Things.

Three High-Potential Opportunities

But with so much available data and so many improvable processes, it can be challenging for executives to determine where they should focus their limited time and resources. In our work with supply chain operations across a range of industries, we see three opportunities that offer high potential in the near term. Companies that exploit them can generate significant revenues and profits, as well as reduce costs markedly, lower cash requirements, and boost agility.

Visualizing Delivery Routes. Logistics management challenges all but the most sophisticated specialists in “last-mile delivery.” Traditional routing software at advanced delivery companies can show drivers exactly where and how they should drive in order to reduce fuel costs and maximize efficiency. The most flexible systems can plan a truck’s route each day on the basis of historical traffic patterns. But many ordinary systems still leave a lot to be desired, producing significant slack in schedules and, in many cases, lacking the ability to dynamically visualize and calibrate routes at the street level.

Now, add the difficulty of aligning the deliveries of two or more business units or companies, each of which manages its own delivery system but must work with the others as one. We frequently find that by using big data and advanced analytical techniques to deal with tough supply-chain problems such as these, companies can identify opportunities for savings equal to 15 to 20 percent of transportation costs. Recent advances in geoanalytical mapping techniques, paired with the availability of large amounts of location data and cheap, fast, cloud-based computing power, allow companies to dynamically analyze millions of data points and model hundreds of potential truck-route scenarios. The result is a compelling visualization of delivery routes—route by route and stop by stop.
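As a toy illustration of the kind of scenario comparison described above (not BCG's actual tooling), the sketch below estimates total driving distance when delivery stops are served from a single depot versus split between two local depots, using great-circle distances; all coordinates are invented.

```python
# Toy route-scenario comparison: one depot vs. two local depots.
# Great-circle (haversine) distance; all coordinates are invented.
from math import asin, cos, radians, sin, sqrt

def haversine_km(p, q):
    lat1, lon1, lat2, lon2 = map(radians, (*p, *q))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

depot_a, depot_b = (40.75, -74.00), (40.65, -73.95)          # hypothetical depots
stops = [(40.72, -74.01), (40.70, -73.99), (40.66, -73.94), (40.64, -73.96)]

# Scenario 1: every stop is an out-and-back trip from depot A.
single_depot_km = sum(2 * haversine_km(depot_a, s) for s in stops)

# Scenario 2: each stop is served from whichever depot is closer.
two_depot_km = sum(2 * min(haversine_km(depot_a, s), haversine_km(depot_b, s))
                   for s in stops)

print(f"single depot: {single_depot_km:.1f} km, two depots: {two_depot_km:.1f} km")
```

Real routing engines sequence multi-stop routes and account for road networks and live traffic, but even this crude comparison shows how splitting a fleet between depots can shorten routes.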

Consider the challenges experienced during the premerger planning for the combination of two large consumer-products companies. To better model the merger of the companies’ distribution networks, the two companies layered detailed geographic location data onto delivery data in a way that made it possible for them to visualize order density and identify pockets of overlap. The companies learned that they shared similar patterns of demand. (See Exhibit 1.) Vehicle-routing software also enabled rapid scenario testing of dozens of route iterations and the development of individual routes for each truck. Scenario testing helped the companies discover as much as three hours of unused delivery capacity on typical routes after drivers had covered their assigned miles.

[Exhibit 1]

Splitting the fleet between two local depots in one major city would reduce the number of miles in each route and allow trucks to deliver greater volume, lowering the effective cost per case. After the merger, trucks would be able to make the same average number of stops while increasing the average drop size by about 50 percent. The savings from a nationwide combination and rationalization of the two networks were estimated at $40 million, or 16 percent of the total costs of the companies combined. All this would come with no significant investment beyond the initial cost of developing better modeling techniques.

By establishing a common picture of the present and a view of the future, the geoanalysis also delivered less quantifiable benefits: the results built confidence that the estimated savings generated as a result of the merger would reflect reality when the rubber met the road and would also create alignment between the two organizations prior to the often difficult postmerger-integration phase. However, results such as these are only the beginning. New visualization tools, combined with real-time truck monitoring and live traffic feeds from telematics devices, open up even more exciting opportunities, such as dynamic rerouting of trucks to meet real-time changes in demand.

Pinpointing Future Demand. Forecasting demand in a sprawling manufacturing operation can be cumbersome and time consuming. Many managers have to rely on inflexible systems and inaccurate estimates from the sales force to predict the future. And forecasting has grown even more complicated in the current era of greater volatility in demand and increasing complexity in product portfolios.

Now, companies can look at vast quantities of fast-moving data from customers, suppliers, and sensors. They can combine that information with contextual factors such as weather forecasts, competitive behavior, pricing positions, and other external factors to determine which factors have a strong correlation with demand and then quickly adapt to the current reality. Advanced analytical techniques can be used to integrate data from a number of systems that speak different languages—for example, enterprise resource planning, pricing, and competitive-intelligence systems—to allow managers a view of things they couldn’t see in the past. Companies can let the forecasting system do the legwork, freeing the sales force to provide the raw intelligence about changes in the business environment.
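As a hedged sketch of that idea, the snippet below regresses synthetic daily demand on a few contextual factors (temperature and prices) to see which ones correlate with it; in practice these feeds would come from ERP, pricing, and competitive-intelligence systems as described above.

```python
# Illustrative only: which contextual factors correlate with demand?
# All data here are synthetic stand-ins for real ERP/pricing/weather feeds.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 365
temperature = rng.normal(15, 8, n)          # daily average temperature
own_price = rng.normal(10, 1, n)            # our selling price
competitor_price = rng.normal(10, 1, n)     # competitor's price

# Synthetic "true" demand with noise added.
demand = (500 + 6 * temperature - 30 * own_price + 20 * competitor_price
          + rng.normal(0, 25, n))

X = np.column_stack([temperature, own_price, competitor_price])
model = LinearRegression().fit(X, demand)
for name, coef in zip(["temperature", "own_price", "competitor_price"], model.coef_):
    print(f"{name:>16}: {coef:+.1f} units of demand per unit change")
```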

Companies that have a better understanding of what they are going to sell tomorrow can ship products whenever customers request them and can also keep less stock on hand—two important levers for improving operational performance and reducing costs. Essentially, with better demand forecasting, companies can replace inventory with information and meet customers’ demands in a much more agile way. We find that companies that do a better job of predicting future demand can often cut 20 to 30 percent out of inventory, depending on the industry, while increasing the average fill rate by 3 to 7 percentage points. Such results can generate margin improvements of as much as 1 to 2 percentage points.

For example, a global technology manufacturer faced significant supply shortages and poor on-time delivery of critical components as a result of unreliable forecasts. Salespeople were giving overly optimistic forecasts, whose effects rippled through the supply chain as the manufacturer ordered more than was really needed to ensure adequate supply. In addition, the company’s suppliers ordered too much from their own component suppliers. As a result, inventories started to increase across the value chain.

To understand the causes of poor forecast performance, the company used advanced tools and techniques to analyze more than 7 million data points, including shipment records, historical forecasting performance, and bill-of-material records. The company also ran simulations comparing forecast accuracy with on-time shipping and inventory requirements to identify the point of diminishing returns for improved accuracy. The underlying pattern of demand proved complex and highly volatile, particularly at the component level. Root cause analysis helped identify the sources of the problem, which included the usual delays and operational breakdowns, as well as more subtle but equally powerful factors such as misaligned incentives and an organization structure with too many silos.

In response, the company redesigned its planning process, dedicating more time to component planning and eliminating bottlenecks from data flows and IT processing. Furthermore, by improving the quality of the data for the component planners, the company was able to reduce the time wasted chasing data and fixing errors. And it developed more sophisticated analytical tools for measuring the accuracy of forecasts.

On the basis of these and other organizational and process improvements, the company expects to improve forecast accuracy by up to 10 percentage points for components and 5 percentage points for systems, resulting in improved availability of parts and on-time delivery to customers. The changes are expected to yield an increase in revenues, while lowering inventory levels, delivering better customer service, and reducing premium freight costs.

Simplifying Distribution Networks. Many manufacturers’ distribution networks have evolved over time into dense webs of warehouses, factories, and distribution centers sprawling across huge territories. Over time, many such fixed networks have trouble adapting to the shifting flows of supplies to factories and of finished goods to market. Some networks are also too broad, pushing up distribution costs. The tangled interrelationships among internal and external networks can defy the traditional network-optimization models that supply chain managers have used for years.

But today’s big-data-style capabilities can help companies solve much more intricate optimization problems than in the past. Leaders can study more variables and more scenarios than ever before, and they can integrate their analyses with many other interconnected business systems. Companies that use big data and advanced analytics to simplify distribution networks typically produce savings that range from 10 to 20 percent of freight and warehousing costs, in addition to large savings in inventories.

A major European fast-moving-consumer-goods company faced these issues when it attempted to shift from a country-based distribution system to a more efficient network spanning the continent. An explosion in the volume and distribution of data across different systems had outstripped the company’s existing capacity, and poor data quality further limited its ability to plan.

The company used advanced analytical tools and techniques to design a new distribution network that addressed these rising complexities. It modeled multiple long-term growth scenarios, simulating production configurations for 30 brands spread across more than ten plants, each with different patterns of demand and material flows. It crunched data on 50,000 to 100,000 delivery points per key country and looked at inventory factors across multiple stages. Planners examined numerous scenarios for delivery, including full truck loads, direct-to-store delivery, and two-tier warehousing, as well as different transport-rate structures that were based on load size and delivery direction.

Unlocking insights from this diverse data will help the company consolidate its warehouses from more than 80 to about 20. (See Exhibit 2.) As a result, the company expects to reduce operating expenses by as much as 8 percent. As the number of warehouses gets smaller, each remaining warehouse will grow bigger and more efficient. And by pooling customer demand across a smaller network of bigger warehouses, the company can decrease the variability of demand and can, therefore, hold lower levels of inventory: it is volatile demand that causes manufacturers to hold more safety stock.
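The inventory effect mentioned above is the classic risk-pooling argument: if regional demands are roughly independent, their standard deviations add in quadrature, so one pooled warehouse needs less safety stock than several local ones. The sketch below applies the textbook safety-stock formula with invented numbers.

```python
# Back-of-the-envelope risk-pooling sketch: SS = z * sigma_demand * sqrt(lead_time).
# Numbers are invented; regional demands are assumed independent.
from math import sqrt

z = 1.65                       # service-level factor (~95% cycle service level)
lead_time_days = 4             # replenishment lead time
sigmas = [120, 90, 150, 80]    # daily demand std. dev. at four local warehouses

# Decentralized: each warehouse carries its own safety stock.
decentralized = sum(z * s * sqrt(lead_time_days) for s in sigmas)

# Pooled: one warehouse serves combined demand; independent variances add,
# so the pooled standard deviation is the square root of the summed variances.
sigma_pooled = sqrt(sum(s ** 2 for s in sigmas))
pooled = z * sigma_pooled * sqrt(lead_time_days)

print(f"decentralized safety stock: {decentralized:,.0f} units")
print(f"pooled safety stock:        {pooled:,.0f} units "
      f"({1 - pooled / decentralized:.0%} lower)")
```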

[Exhibit 2]

How to Begin

Operations leaders who want to explore these opportunities should begin with the following steps.

Connect the supply chain from end to end. Many companies lack the ability to track details on materials in the supply chain, manufacturing equipment and process control reliability, and individual items being transported to customers. They fail to identify and proactively respond to problems in ways that increase efficiency and address customers’ needs. In order to have big data to analyze in the first place, companies must invest in the latest technologies, including state-of-the-art sensors and radio-frequency identification tags, that can build transparency and connections into the supply chain. At the same time, companies should be careful to invest in areas that add the highest business value.

Reward data consistency. Many companies struggle to optimize inventory levels because lot sizes, lead times, product SKUs, and measurement units are entered differently into the various systems across the organization. While big-data systems do not require absolutely perfect data quality and completeness, a solid consistency is necessary. The problem is that in many companies, management doesn’t assign a high priority to the collection of consistent data. That can change when leaders make the impact of poor data clear and measure and reward consistent standards.

Build cross-functional data transparency. The supply chain function depends on up-to-date manufacturing data, but the manufacturing function may tightly guard valuable reliability data so that mistakes will be less visible. The data could also help customer service, which might inform customers proactively of delayed orders when, for example, equipment breaks down. Data about production reliability, adherence to schedules, and equipment breakdowns should be visible across functions. To encourage people to be more transparent, management might assemble personnel from different functions to discuss the data they need to do their jobs better.

Invest in the right capabilities. Many operations leaders still don’t understand how this new discipline can provide a competitive advantage or how to convert big data into the best strategic actions. Hiring a team of top-shelf data scientists to do analytics for analytics sake is not the answer, however. Companies need to both partner with others and develop their own internal, diverse set of capabilities in order to put big data into a strategic business context. Only then will they be able to focus on the right opportunities and get the maximum value from their investments.


Companies that excel at big data and advanced analytics can unravel forecasting, logistics, distribution, and other problems that have long plagued operations.

Those that do not will miss out on huge efficiency gains. They will forfeit the chance to seize a major source of competitive advantage.



Originally posted via “Making Big Data Work: Supply Chain Management”

Originally Posted at: Making Big Data Work: Supply Chain Management