When disaster strikes, horrific images of damaged environments and upsetting stories of lives impacted and lost flood media networks across all platforms. We are lucky to live in a world with thousands of organizations whose dedicated employees are among the first to respond to devastated locations. It is important to note, however, that everyday citizens are also anxious and curious about how they can help, whether on the ground or from a distance. We have seen, disaster after disaster, everyday citizens from around the world coming together to show support through monetary donations, canned food and clothing drives, and even hashtags.
Suzanne Bernier’s book, Disaster Heroes, captures the incredible journeys of everyday citizens who assist in crisis situations. Bernier shares the stories of a man in Louisiana who raised funds to send a fire truck to a Brooklyn firehouse of the New York City Fire Department to replace equipment damaged on 9/11; of a Pennsylvania-based drilling company that shared valuable knowledge and equipment with the effort to extract the 33 miners trapped in the 2010 Chilean mine collapse; and of a Hudson River ferry captain who acted on his immediate instincts and professional experience to aid in several relief efforts. The book captures an important theme seen amongst both formal and informal actors in crisis response:
“There are no borders when it comes to disasters. We’re all in this together.”
This mantra demonstrates the goodness in our globalized society, and I would argue it also illustrates the necessity of cooperation in crisis response. As people from all over the world aim to assist a crisis-impacted community, borders, whether geographical, cultural, functional, or temporal, should not disrupt effective cooperation; instead, they should foster partnership and support.
What physical or conceptual borders have you experienced that have challenged effective cooperation in the crisis environment?
Why are so many companies across a diverse set of industries investing in and around the Internet of Things? Everywhere I go, every blog I read … I sound like my favorite band from the 80s: the Internet of Things is watching me.
In reality, it’s the reverse: I’m seeing the Internet of Things (IoT) everywhere: companies investing in sensors, networking and applications with the expectation that this investment will increase revenues, lower costs and improve profitability over the short and long term.
While the term the Internet of Things was coined in 1999 by Kevin Ashton at Procter & Gamble, the mainstream application of IoT is just getting started. As the trend has heightened, I’ve been evaluating the potential for IoT to support better decision making in travel and transportation.
My experience in the travel and transportation industry has always been about using analytics to support decision making. In fact, I started my career as a pricing strategy analyst at American Airlines. And now I’m fully converted. You can call me an evangelist for the IoT in transportation, especially because the potential to take data coming from the IoT, incorporate purposeful analytics, and reach better decisions quickly is significant.
Let me speak directly about why I have become so enamored with the ability of the IoT to deliver business value, especially in transportation. It’s really quite simple: the IoT delivers value through the data that the things provide for decision-making.
This data is collected by sensors and devices on railcar components, semi-truck engines, or other elements within the transportation value chain. And it is now available on a real-time, or streaming, basis. This provides the ability to learn from trends in that data quickly and act upon those trends within seconds.
Gone are the days when data is collected over the span of weeks or months, sent to an analyst for review, and – after another week or two of data crunching – the analyst presents a report or PowerPoint to her boss with recommendations for changes.
As Tom Davenport said:
To make the Internet of Things useful, we need an Analytics of Things. This will mean new data management and integration approaches, and new ways to analyze streaming data continuously.
So, in order to take advantage of the streaming big data (and it is big data by every definition of the phrase) coming from the sensors, we must reconsider how we use analytics. Remember that I didn’t say the analytics themselves must change. In most cases, we can apply the same analytics to streaming data as we used in a batch model.
What we need is a good understanding of where we apply the analytics: on the edge, at rest or in the middle. Let me explain:
Analytics on the edge means analysis at the specific device or sensor.
Analytics at rest means data pulled out of the stream and used for high-performance analytic model development.
Analytics in the middle takes place on data as it’s streaming. Some analysts have called this middle ground “the fog,” and it’s relevant because it can be a combination of the streaming data itself enriched with sitting data such that we can detect more complex events sooner.
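To make the three locations concrete, here is a minimal Python sketch applying the same basic analytic (a deviation check) at each stage. The railcar temperature readings, thresholds, and window sizes are illustrative assumptions, not from any real deployment:

```python
from collections import deque
from statistics import mean, stdev

# Hypothetical wheel-bearing temperature readings (Celsius) from a railcar sensor.
EDGE_LIMIT_C = 90.0  # hard threshold cheap enough to check on the device itself

def analytics_on_the_edge(reading_c):
    """Edge: a single-reading check that runs at the specific device or sensor."""
    return "ALERT" if reading_c > EDGE_LIMIT_C else "OK"

def analytics_in_the_middle(stream, window=5, z=2.0):
    """Middle ('the fog'): flag readings that deviate from a sliding window
    of the stream itself, catching anomalies within seconds of arrival."""
    recent = deque(maxlen=window)
    alerts = []
    for r in stream:
        if len(recent) == window:
            m, s = mean(recent), stdev(recent)
            if s > 0 and abs(r - m) > z * s:
                alerts.append(r)
        recent.append(r)
    return alerts

def analytics_at_rest(history):
    """At rest: fit a simple baseline model on data pulled out of the stream."""
    return {"mean": mean(history), "stdev": stdev(history)}
```

The point of the sketch is that the analytic logic barely changes between stages; what changes is where it runs and how much data it can see at once.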
We have now arrived at a different place in the analytic continuum. The optimal analytics experience is a multi-stage analytics experience. It includes continuous queries on data in motion and at the edge, with incrementally updated results. This new process moves analytics from centralized data warehouses to edge analytics, which are closer to the occurrence of the events.
What does multi-stage analytics of IoT data look like? It happens fast (seriously, we are talking milliseconds, or even microseconds, at this point) and at very high volumes. It requires specific business rules that give instructions on whether to save, discard, aggregate, transform or enrich the streaming data without overloading the entire system or network. Multi-stage analytics includes pre-determined data mining, decision making, alerting, scoring, and profiling of the data to exploit the value of the streaming data. And it might also include managing the data differently – creating “out of order” handling to make the data source streams understandable to the analytics and the decision-makers.
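As a rough illustration of those rules, the sketch below applies discard and enrich rules to a hypothetical event stream, and uses a small buffer to restore timestamp order when events arrive late. The asset IDs, plausibility bounds, and reference data are invented for illustration:

```python
import heapq

# Static ("sitting") reference data used to enrich the stream; IDs are made up.
ASSET_INFO = {"railcar-7": {"type": "hopper", "depot": "Topeka"}}

def multistage(events, reorder_buffer=3):
    """Apply save/discard/enrich rules to a stream, with out-of-order handling.

    Each event is (timestamp, asset_id, reading). A small min-heap buffer
    releases events in timestamp order even if they arrive slightly late.
    """
    heap, out = [], []

    def process(ts, asset, reading):
        if not (-50 <= reading <= 200):      # rule: discard implausible values
            return
        info = ASSET_INFO.get(asset, {})     # rule: enrich with sitting data
        out.append({"ts": ts, "asset": asset, "reading": reading,
                    "depot": info.get("depot", "unknown")})

    for ev in events:
        heapq.heappush(heap, ev)
        if len(heap) > reorder_buffer:       # release the oldest buffered event
            process(*heapq.heappop(heap))
    while heap:                              # flush the buffer at end of stream
        process(*heapq.heappop(heap))
    return out
```

The buffer trades a little latency for correctly ordered output, which is exactly the “out of order” handling the streaming analytics downstream depend on.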
We have all the building blocks in place to exploit the value of the Internet of Things and the Analytics of Things: sensors on assets creating data, the communications network connecting that data, and the analytic and computing applications that make use of the data flowing to and from the things.
The Internet of Things can be transformative in transportation operations: in maintenance and engineering, you will have more information sooner, which means you can predict the maintenance needs of individual assets before failures occur and proactively service assets at an opportune time when your asset is near a repair facility. This reduces costs across your operations. In supply chain situations, you can monitor inventory levels on a near real-time basis, develop better forecasting models and optimize this inventory, when and where you need, lowering supply costs, increasing efficiency and enhancing revenue opportunities. On the customer side of transportation, you can enhance the customer’s experience by providing real-time forecasts of arrivals and notifying them sooner if delays occur. And, happier customers are loyal customers.
The Internet of Things opens up tremendous opportunities for transportation companies, generating significant streaming data which can be relevant for decision-making. However, it is critical to apply the appropriate analytics to streaming data in order to derive value from that data.
Multi-stage analytics is not rocket science; it’s simply the judicious application of the right analytics at the right time in the right place to the right data, which is what you need to exploit the value of the Internet of Things. That is why I’ve become an Internet of Things and Analytics of Things evangelist.
This content is reposted with permission from SAS Voices, where the original post appeared.
Using a “Value Added” Approach in Supply Chain Forecasting
By IDB Guest Blogger: Michael Gilliland, SAS
Materiel and supplies are critical to the men and women of the U.S. Armed Forces, at home and abroad. Whether it’s ammunition, weapons, IT, or toilet paper, if they need it, someone is in charge of getting it and making sure there is sufficient inventory. The Government Accountability Office has identified this mission-critical role – supply chain management – as an area for improvement for the U.S. Department of Defense (DoD), particularly inventory management through better forecasting. As DoD works to improve its supply chain forecasting capabilities, this is an area where DoD can learn from the mistakes of the private sector.
The reality is that forecasting can be a huge waste of management time. This is not an indictment of the practice of forecasting as a whole, but rather of how organizations usually approach and apply forecasting incorrectly.
The problem is not that forecasting is pointless, irrelevant, or unnecessary. Rather, the problem is supply chain leaders squandering too much time and too many resources on forecasting with a myriad of bad practices.
Tools and Methods
In some cases, the technologies organizations use for demand planning and forecasting are outdated or simply misapplied. By relying on outdated tools or methodologies, these forecasters miss the progress made in recent years to improve accuracy, reduce bias, and minimize the cost of forecasting through large-scale automation. Even heroic efforts on their part are likely to deliver underwhelming results, whereas an unpoliticized and unbiased forecast can lead to cost savings and, more important, can save lives.
The goal of forecasting is to obtain an objective, dispassionate number that is as accurate as can reasonably be expected given the nature of whatever you happen to be forecasting. Rather than working from this perspective, however, many managers and forecasters have unrealistic expectations for the level of accuracy achievable. They rely too heavily on the current “fit” of models to history when their job is to forecast the future. Almost invariably, the forecast will be less (often much less) accurate than the fit to history. (It is always much easier to explain the past than to predict the future.)
Don’t Trust the Process
Perhaps the most blatant example of waste in the forecasting process is “forecasting by committee.” This is where a forecast is passed through so many different stages of approval and has been tweaked by so many collaborators that its integrity is actually degraded. It is easy to see how this approach could occur in any bureaucratic environment – including the military – where each participant has a personal agenda they express with their adjustment.
The problem is that this kind of elaborate review process ends up being extremely costly – in two ways. First, because the process is unnecessarily consuming everyone’s time. And second, because your outfit may actually be in a worse position than if you had not attempted to incorporate so much “management intelligence” into the forecast in the first place. In a study of eight commercial supply chain companies, Steve Morlidge (author of the book Future Ready) found less than half their forecasts were more accurate than the “naïve forecast” – i.e., the forecast you get by doing nothing and simply using the last available data point as your future prediction.
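Morlidge’s naive-forecast benchmark is easy to reproduce. The sketch below, using MAPE as the accuracy measure and entirely made-up demand numbers, compares a heavily adjusted “committee” forecast against the do-nothing naive forecast:

```python
def mape(actuals, forecasts):
    """Mean absolute percentage error, in percent."""
    return 100 * sum(abs(a - f) / abs(a)
                     for a, f in zip(actuals, forecasts)) / len(actuals)

def naive_forecast(history):
    """The 'do nothing' forecast: the forecast for period t is simply
    the actual observed in period t-1."""
    return history[:-1]

# Illustrative monthly demand for one stable item (invented numbers).
demand = [100, 104, 98, 102, 101, 99, 103]
# Hypothetical committee forecasts for months 2-7, after many "adjustments".
committee = [110, 90, 115, 85, 120, 95]

naive_err = mape(demand[1:], naive_forecast(demand))
committee_err = mape(demand[1:], committee)
```

With these numbers the naive forecast is several times more accurate than the tweaked one – the benchmark every forecasting step should have to beat before it consumes anyone’s time.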
Know Your Limits
Even if you use the proper tools and methods to forecast future supply chain needs, it is important to realize that overall forecastability still limits the maximum possible accuracy of forecasts. Supply chain leaders need to avoid demanding a level of forecast accuracy that is simply impossible to obtain because they have not considered the nature of what they are trying to forecast.
To illustrate this concept, suppose your job is to forecast Heads or Tails each day in the tossing of a fair coin. While you may have some lucky streaks and forecast correctly several days in a row, over the long haul your forecast will be correct just 50 percent of the time. It doesn’t matter if your ranking officer demands 60 percent accuracy or higher – you are limited to a 50 percent accuracy ceiling by the nature of the behavior you are trying to forecast. Not even bigger computers and more sophisticated software will help – there is nothing anyone can do to achieve 60 percent accuracy. Unachievable objectives motivate forecasters to simply give up, or to find a way to cheat and game the system.
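The coin-toss ceiling is easy to verify with a throwaway simulation (fixed seed for reproducibility; the strategies are deliberately arbitrary):

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

def forecast_accuracy(strategy, n=100_000):
    """Fraction of fair coin tosses a forecasting strategy predicts correctly."""
    hits = 0
    for _ in range(n):
        toss = random.choice(["H", "T"])
        if strategy() == toss:
            hits += 1
    return hits / n

always_heads = lambda: "H"                        # the simplest possible rule
clever_guess = lambda: random.choice(["H", "T"])  # "sophisticated" guessing
```

Both strategies land within a fraction of a percent of 50 percent accuracy; no amount of cleverness moves the ceiling.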
Volatility and Forecast Accuracy. Scatterplot: Mike Gilliland.
As seen in this scatterplot of 5000 items being forecast by a consumer goods manufacturer, the variability or “volatility” of a demand pattern has a big impact on how well we can expect to forecast it. Smooth, stable, repeating patterns can be forecast quite accurately with simple methods. But wild, volatile, erratic patterns may never be forecast accurately, no matter how many resources we commit to forecast them. To the extent that we can control and limit volatility, we are likely to achieve more accurate forecasts.
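One common, simple screen for forecastability is the coefficient of variation (standard deviation divided by mean) of the demand history. The sketch below, with invented demand series, shows how a smooth and an erratic pattern separate cleanly on this measure:

```python
from statistics import mean, stdev

def coefficient_of_variation(series):
    """CoV = stdev / mean; a rough screen for how forecastable a pattern is."""
    return stdev(series) / mean(series)

# Invented demand histories for two items.
smooth = [100, 102, 99, 101, 100, 98, 103, 100]   # stable, repeating pattern
erratic = [5, 180, 0, 95, 310, 12, 240, 60]       # wild, intermittent demand

# Items can then be segmented: low CoV -> simple statistical methods suffice;
# high CoV -> focus on safety stock and volatility reduction, not better models.
```

A segmentation like this sets realistic accuracy expectations per item before anyone demands an unachievable number.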
Solution: Knowledge is Power
Private sector companies are increasingly using a method called “Forecast Value Added (FVA) Analysis” to improve performance of their forecasting processes. FVA is the application of basic science to evaluate a forecasting process – measuring each step in the process, and determining whether it is “adding value” by making the forecast more accurate and less biased. Here is an example of an FVA report (in this case, the Analyst Override step is just making the forecast worse!):
Example: Forecast Value Added (FVA) Analysis. Table: Mike Gilliland.
Forecast Value Added analysis is about rooting out the waste and inefficiency from forecasting efforts (so is consistent with a lean approach to supply chain management). It allows organizations to streamline their process and redirect the non-value-adding efforts into more productive activities that will be more beneficial to the mission at hand. Understanding cautionary tales of corporate sector mistakes and challenges can help the DoD create real improvements in defense supply chain management. By avoiding wasteful steps and procedures, defense organizations have the opportunity to achieve better forecasts with less effort, and less cost.
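Under the hood, an FVA report is just a chain of error comparisons between consecutive process steps. A minimal sketch, using MAPE as the accuracy measure and made-up numbers (a real report would also track bias):

```python
def mape(actuals, forecasts):
    """Mean absolute percentage error, in percent."""
    return 100 * sum(abs(a - f) / abs(a)
                     for a, f in zip(actuals, forecasts)) / len(actuals)

def fva_report(actuals, steps):
    """Build an FVA table from an ordered list of (step_name, forecasts).

    The first entry should be the naive-forecast baseline. Positive FVA means
    the step improved accuracy versus the step before it; negative means the
    step is destroying value and is a candidate for elimination.
    """
    report, prev_err = [], None
    for name, forecasts in steps:
        err = mape(actuals, forecasts)
        fva = None if prev_err is None else prev_err - err
        report.append({"step": name, "mape": round(err, 1),
                       "fva": None if fva is None else round(fva, 1)})
        prev_err = err
    return report

# Illustrative numbers only.
actuals = [100, 110, 105, 95]
steps = [
    ("Naive forecast",    [98, 100, 110, 105]),
    ("Statistical model", [101, 107, 104, 97]),
    ("Analyst override",  [90, 120, 95, 110]),  # the tweaks make things worse
]
```

Run per step and reviewed regularly, a table like this makes it obvious which touches of the forecast to keep and which to retire.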
Michael Gilliland is an IDB guest blogger with more than 15 years of forecasting and supply chain management experience in the food, apparel, and consumer electronics industries. Gilliland has been featured in the LOGTECH Advanced Program in Logistics and Technology. He is author of The Business Forecasting Deal, and has published articles in Supply Chain Management Review, Foresight: The International Journal of Applied Forecasting, Journal of Business Forecasting, Analytics, and APICS magazine. He is also an editor for Foresight. Gilliland holds a BA in philosophy from Michigan State University and master’s degrees in philosophy and mathematical sciences from Johns Hopkins University. He writes The Business Forecasting Deal Blog at blogs.sas.com/content/forecasting. For more than a decade, SAS and the IDB have worked collaboratively to raise the level of awareness of the value of analytics to defense leaders.
When Tom Davenport and D.J. Patil suggested in their article published in Harvard Business Review that data scientist is the “sexiest job of the 21st century,” these in-demand professionals became part of the discussion surrounding the big data groundswell.
On April 10, 2015, at the 3rd Annual Business Analytics Forum at Indiana University’s Kelley School of Business, the Institute for Defense and Business (IDB) and SAS had an opportunity to address over two hundred students, faculty members and industry partners who have a vested interest in grooming these superstars of the future.
The Kelley School’s Institute for Business Analytics, co-chaired by Vijay Khatri and Frank Acito, was one of the first programs of its kind established to prepare students for careers in business analytics. The Institute hosts this conference to make its students savvy in real-world analytics and bridge the gap between academia and the real world. It actively seeks out and engages with industry partners who provide internships and other pathways to employment.
This year’s conference topics included the Internet of Things (IoT), supply chain analytics, and healthcare analytics. The IDB, SAS, Deloitte and IBM were invited to speak to the importance of analytics in government.
Government Analytics Jobs
Government is a heavy user of analytics and acutely feels the pain of the analytics skills gap across all agencies. A 2014 GovLoop survey reported that 96 percent of those surveyed – 46 percent of whom self-identify as experts or analysts – believe their agency has a data skills gap.
Last year’s audience gave the thumbs up to our “Data vs. Gut” panel discussion, which featured retired military officers giving their perspective on the evolution they’ve seen of analytics use in decision making in defense organizations. This year, due to unforeseen circumstances, our defense panelist was not able to make it. Van Noah, panel moderator, smoothly shifted gears and reworked the flow of the conversation to provide the students in the audience with a broad preview of how they can apply their data skills in government analytics jobs.
If you were not able to see the discussion in person, here is a summary of the content these experts were able to shed light on:
Since this panel discussion was held just five days before the income tax filing deadline of April 15th – and stories about scammers were prolific in the media – what better subject to open the conversation?
Van Noah from IDB was able to represent the government sector and weave in some thoughts on how the military could better use analytics to select candidates for intensive training programs. For example, pilot training is expensive, so using analytics to identify students who are likely to wash out early is much safer and more cost effective than letting all students progress to the next phase of training.
Satish Lalchand from Deloitte has a long history of working with analytics in government and provided examples of how the federal government is leveraging analytics to combat fraud and improper payments. He also shared his perspective on how agencies can get started once they see the business need for analytics and what it takes besides analytics to make better decisions.
Eric Zidenberg from SAS has been involved with public safety organizations for many years and talked about how analytics are currently being used on the southwest border to make better decisions on which cars should be sent to secondary inspection for illicit materials.
Dion Rudnicki from IBM segued nicely into talking about how the Memphis Police Department was using analytics to better allocate resources to reduce crime. In a city previously identified as #1 for crime in the US, and where there are only 2,500 police officers to protect the population of 650,000, analytics allows government to place the right resources at the right places at the right time. The result: a 30% drop in crime.
Next Generation Data Scientists
Students preparing for careers in data science have a lot of options when they enter the workforce. The data analytics talent gap exists in every sector of our global economy. When students weigh their options, they should consider jobs that support the public sector. Serving government will give them a chance to make a real difference. Government needs these talented data scientists of the future!
Gail Bamford is an IDB guest blogger and has over 30 years of experience working in the public sector IT market. She has been with SAS since 2006 and is passionate about helping close the analytics talent gap. For more than a decade, SAS and the IDB have worked collaboratively to raise the level of awareness of the value of analytics to defense leaders.
As the U.S. Army draws down after nearly fifteen years of war, the service is reflecting on lessons learned as well as looking ahead to future conflicts and challenges. A central product of this analysis is the new Army Operating Concept called “Win in a Complex World.” Published October 2014, this document from Army Training and Doctrine Command seeks to characterize the nature of the world we now live in (unpredictable, quickly evolving, and increasingly complex) and shift attention away from weapons, technologies and systems in favor of a focus on the capabilities at all levels (tactical, operational, and strategic) that will be needed to win future conflicts.
The Army Operating Concept took center stage in Huntsville, Alabama at the Association of the U.S. Army’s Global Force Symposium. Serving as the keynote speaker for the symposium, General Ray Odierno, Chief of Staff of the Army, highlighted the challenges the United States is facing around the world; from West Africa to North Korea, the U.S. is confronted with a broad range of enemies and enemy capabilities. Many of these threats are long-term in nature and will require sustained operations to bring them to an end. At the same time, the Army and all other services are entering an era of tightened budgets which, according to the service chiefs, including GEN Odierno, severely threatens the Army’s readiness and modernization efforts. It is these modernization efforts that will deliver the capabilities needed to win in a complex world.
Given the lessons learned from the past fifteen years and the budgetary difficulties that lie ahead, the Army is indeed at what GEN Odierno described as “a strategic inflection point.” Tremendous challenges await the service and its major commands. However, GEN Odierno reminded the audience that “it’s people who win wars,” and that the U.S. Army remains the world’s best land force because of its men and women in uniform who are out in the world making a difference.
Almost every session at AUSA ILW Global Force Symposium centered on the Army Operating Concept. Do you think the AOC will drive real change in the Army, or will it fall victim to various constraints such as contracting, personnel systems, and the ever-changing budget situation? Share your thoughts in the comments below.