Amtrak Derailment in Philadelphia surfaces important points likely to be on any technical product development roadmap

The chronicle of the tragedy that befell an Amtrak train on May 12, 2015 includes points worth considering by any product marketer working on solutions for process control, or even the Internet of Things (IoT). These points should also interest anyone with a role in an operational risk management (ORM) effort for mechanized mass transport.

Some comments on the most prominent of these points, namely Amtrak's inability to implement Positive Train Control:

Just because a customer has purchased a solution, or committed resources to one, does not mean the customer has taken the steps required to move forward on it. As Jad Mouawad wrote in the New York Times on May 13, 2015, in an article titled Technology That Could Have Prevented Amtrak Derailment Was Absent, Positive Train Control (a complex solution leveraging real-time data from sensors to manage the performance of locomotives on rails) "... might have prevented the derailment of a Metro-North commuter train in the Bronx in December 2013 that killed four people and injured dozens ..." and, likely, the Philadelphia tragedy as well.

But, Mouawad writes, "... the absence of the technology has come up repeatedly." Bottom line: Positive Train Control looked great on paper, but the task of applying it, Mouawad writes, "... involves fitting 36,000 wayside units and equipping 26,000 locomotives, according to industry figures."

The takeaway for product marketers? Putting together a "complexity assessment," complete with an estimate of the likely impact on customer ROI, should be a mandatory feature of a product roadmap.

In turn, from the customer side of a purchase decision, an internal operational risk management (ORM) effort should discount the usefulness of a purchase like Positive Train Control against the likely internal obstacles to implementation. The discount should be applied to the ROI expected from the investment, and a governance plan should include the steps required to overcome these obstacles to ROI.
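To make the discounting mechanics concrete, here is a minimal sketch. Every figure in it (the projected gain, the cost, and the probability the rollout actually completes) is a hypothetical illustration of this writer's, not a number drawn from the Amtrak case:

```python
# Hypothetical sketch of discounting a projected ROI by the odds that an
# implementation actually finishes. All figures below are illustrative.

def risk_adjusted_roi(gain: float, cost: float, completion_probability: float) -> float:
    """Expected ROI once the risk of a stalled rollout is priced in."""
    nominal_roi = (gain - cost) / cost
    return nominal_roi * completion_probability

# A $50M safety system projected to return $80M in avoided-incident value,
# but with only a 60% chance the rollout completes as planned:
print(round(risk_adjusted_roi(80e6, 50e6, 0.6), 2))
```

The point of the exercise is simply that a 60% nominal return shrinks to 36% once implementation risk is applied, which is exactly the kind of gap a governance plan should exist to close.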

If your business is developing solutions like Positive Train Control, but you lack an internal product marketing management effort to craft a promising roadmap for your rollout, please do not hesitate to contact us. We bring to the table over 30 years' experience promoting and selling technology solutions (hardware, software, services) to the kind of complex enterprise customer Amtrak, unfortunately, exemplifies in this case.

We can also help customer organizations looking to improve the performance of ORM functions in order to better prepare for tragedies like this one.

Ira Michael Blonder

© IMB Enterprises, Inc. & Ira Michael Blonder, 2015 All Rights Reserved


The Azure cloud wants to power the Internet of Things

Before turning over the keynote podium at Microsoft's TechEd Europe 2014 event to another Microsoft Corporate Vice President, Joe Belfiore, Jason Zander brought two recent success stories to his audience's attention:

  1. the London Underground, “a user of Azure with IoT”
  2. Coca-Cola, "working with self service kiosks" and vending machines

This segue might have resulted from a rebranding of Windows Embedded as the best option for consumers to "[c]reate the Internet of Your Things." On the other hand, the mention of two very large organizations consuming Azure to support enormous populations of smart devices, deployed for mission-critical requirements, can also be read as a method of branding not only the Azure cloud, but also a set of new big data SaaS offers designed to run on top of Azure PaaS.

If for no other reason than to appreciate the extensive latitude Microsoft can exercise as it builds out its IoT messaging, readers should, in this writer's opinion, note the depth of Microsoft's product offers applicable to this already enormous market segment.

It is worth repeating some earlier comments this writer has made about the notion of an Internet of Things: the concept is neither new, nor especially reassuring, once one considers the limited capabilities consumers will likely have to safeguard computing processes running over an enormous number of smart devices, all communicating over the same data protocols.

But Microsoft's now obvious interest in branding itself as a leader in this data communications trend should, to no small extent, provide some reassurance. First, Microsoft's Visual Studio IDE, and the languages it supports (Visual Basic, C, C++, Visual C++, etc.), have long been used by ISVs supporting the ancestors of this new IoT, namely HMIs and the families of devices communicating over one of the bus data communications protocols (Modbus, Profibus, Fieldbus, etc.), so Microsoft already has very important hooks into this market. Second, Microsoft's experience developing a secure data communications environment, built to assure enterprise business consumers of the security of cloud computing, may carry over to the products and solutions it brings to market for IoT consumers.

Ira Michael Blonder

© IMB Enterprises, Inc. & Ira Michael Blonder, 2014 All Rights Reserved


General Electric Steps Into Big Data and Analytics

October 8 and 9, 2014 were a very busy two days for the public relations team at General Electric. No fewer than four press releases were published about the first steps this very mature, and very large, business has taken into big data and analytics.

Consider, for example, how the big data and analytics business at General Electric ramped up to over $1 billion in sales. On October 9, 2014, Bloomberg published an article written by Richard Clough, titled GE Sees Fourfold Rise in Sales From Industrial Internet. Clough reports "[r]evenue [attributed to analytics and data collection] is headed to about $1.1 billion this year from the analytics operations as the backlog has swelled to $1.3 billion".

Early-stage ISVs looking with envy at this lightning-fast entry should consider how scale, a decision to acquire IP via partnerships and acquisitions (rather than building it in-house), and picking the right market made this emerging success story a reality. Let's consider these three points in reverse order:

  1. Picking the right market: GE opted to apply its new tech to a set of markets loosely collected into something they call the “Industrial Internet”. These markets include Energy (exploration, production, distribution), Transportation, Healthcare, Manufacturing and Machinery. Choosing these markets makes complete sense. GE is a leader in each of these already. Why not apply new tech to old familiar stomping grounds?
  2. Leverage partnerships and acquisitions to come to market in lieu of rolling your own: Leading players in each of the markets GE opted to enter expressed burning needs for better security and better insight. Other players in and around these markets (Cisco, Symantec, Stanford University and UC Berkeley) all stand to benefit from the core tech GE brings to the table, so persuading them to partner was likely a comparatively easy task. The most prominent segment of the tech (very promising security tech for industrial, high-speed data communications over TCP/IP, Ethernet networks), understandably, came into the package from Wurldtech, a business GE opted to acquire.
  3. Scale: With GE's production run rate of turbines, locomotive engines, jet engines, and other complex, massive industrial machinery, finding a home for the millions of industrial sensors required to feed the analytics piece of the tech with the big data it needs does not look to have been difficult. Product management, appropriately, looked into its own backyard to find the consumers required to ramp up to scale very quickly.

In sum, if the "rubber hits the road" and metrics bear out claims, GE's entry into this market looks to be a case study early-stage ISVs should memorize as they plan their tech marketing strategy.

Ira Michael Blonder

© IMB Enterprises, Inc. & Ira Michael Blonder, 2014 All Rights Reserved


As Cyber Attacks Grow in Volume and Intensity, The Long Term Viability of an Internet of Things Should be Reconsidered

Small to Medium Sized Businesses (SMBs) in the U.S. are starting to directly feel the pain of the increased daily volume of cyber attacks, not to mention the malicious intent of the payloads they often include. Whether this pain amounts to persistent, annoying junk email, the mess resulting from a mistaken click on a link in one of those messages, or worse, the end result is the same: SMBs are growing more aware of the risks inherent in what this writer refers to as our consumerized, mono-protocol data communications world.

Anyone with an interest in the Internet of Things marketing communications theme, which has been echoed by a number of participants, from Cisco to Microsoft and beyond, should take note of what impact, if any, a more skeptical SMB market will have on the success of this effort. Perhaps it is worth taking a sentence or two to explain why the Internet of Things is actually little more than a marketing communications theme.

"Things" were connected for data communications purposes long before the Internet became the average consumer's notion of data communications between computing devices over a wide area network. Whether the protocol was one of the buses (MODBUS, PROFIBUS, FIELDBUS, etc.), or a serial RS-232 hardwire connection between a computer running a Human Machine Interface (HMI) application and a remote process, or just a sensor, smart machines have been connected to computers since the mid 1970s.
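For readers curious about what those bus protocols actually look like on the wire, here is an illustrative sketch of a Modbus RTU request frame, built per the Modbus serial-line framing (function code 0x03, "read holding registers", with the CRC-16/Modbus checksum). The slave address and register range are arbitrary example values of this writer's choosing:

```python
# Illustrative sketch: constructing a Modbus RTU "read holding registers"
# request frame, checksum included. Example addresses only.

def crc16_modbus(data: bytes) -> int:
    """CRC-16/Modbus: init 0xFFFF, reflected polynomial 0xA001, no final xor."""
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001
            else:
                crc >>= 1
    return crc

def read_holding_registers_frame(slave: int, start: int, count: int) -> bytes:
    # Function code 0x03; register address and count are big-endian.
    body = bytes([slave, 0x03]) + start.to_bytes(2, "big") + count.to_bytes(2, "big")
    crc = crc16_modbus(body)
    return body + crc.to_bytes(2, "little")  # CRC appended low byte first

frame = read_holding_registers_frame(slave=1, start=0, count=10)
# A well-formed frame has the property that the CRC computed over the
# entire frame, checksum bytes included, comes out to zero.
assert crc16_modbus(frame) == 0
```

The point is not the arithmetic, of course, but that these "pre-Internet" protocols already carry well-specified framing and integrity checks; the connected "things" predate the marketing theme.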

With many protocols in use for data communications, the threat of malicious individuals manipulating data communications sessions was generally limited to someone physically rearranging wires on a Plain Old Telephone Service (POTS) peg board.

So the Internet of Things, for anyone familiar with industrial automation, and process control, is little more than simply a marketing theme promoted by some of the “also ran” players who did not participate in the birth of Computer Numerical Control (CNC) machining, SCADA, etc.

But what makes this trendy image particularly scary is the reluctance of the businesses committed to it to look into diversifying the number of data communications protocols in use, so as to patch the near defenselessness of data communications over TCP/IP and the web pages we call the Internet. Should this cycle of hacking continue and accelerate further, that reluctance may, in this writer's opinion, produce a strangely disinterested market.

Ira Michael Blonder

© IMB Enterprises, Inc. & Ira Michael Blonder, 2014 All Rights Reserved


Microsoft Evangelizes Public Cloud and Mobile at TechEd 2014

During the keynote address of Microsoft's 2014 TechEd conference, Brad Anderson, Corporate Vice President, spent a considerable amount of time citing the explosive growth of smart devices and the enormous amount of additional data they produce. Mr. Anderson appeared to be citing these facts as important drivers for organizations to adopt a cloud data repository architecture, together with matching Software as a Service (SaaS) computing procedures.

Mr. Anderson pointed out the value of cloud repositories for all of this new data; not just any cloud repositories, though: he focused on the public cloud.

The tone of the opening few moments of the keynote is very much along the lines of an argument from authority. The multimedia content supporting Mr. Anderson's presentation includes interviews with unnamed authorities (presumably staff at Microsoft). Each interview is a testimony either to the value of all the new data produced on a device-to-device basis, or to the usefulness of a ubiquitous cloud repository for the data and related computing processes. One interviewee even states, "Don't be afraid of it, just jump on for the ride."

Whether subliminally or consciously, anyone viewing the presentation will likely perceive it as an effort to encourage usage of the cloud, along with acceptance of the value of smart devices and all of the data they produce. Perhaps the reason for the evangelical tone is some resistance in the TechEd community (which is made up of "IT Professionals and Enterprise Developers") to public cloud computing resources and, perhaps, to big data, the stuff produced by all of the smart devices cited in the presentation.

If my assumption is correct (and I have nothing substantive from Microsoft® to indicate it is correct), then it is also safe to infer the enterprise organizations supported by these professionals and developers have expressed a reluctance to embrace public cloud computing offers.

The impact on the big picture of what this all may mean for Redmond’s product marketing plan is as follows:

  1. There is a need to encourage faster adoption of public cloud computing on the part of enterprise businesses
  2. Redmond benefits when more customers use Microsoft’s public cloud offers
  3. Redmond is firmly seated on the “Internet of Things” bus

Ira Michael Blonder

© IMB Enterprises, Inc. & Ira Michael Blonder, 2014 All Rights Reserved


On the Need to Set Boundaries Around an Internet of Things

The March/April 2014 edition of Foreign Affairs includes an article titled As Objects Go Online, which was written by Professor Neil Gershenfeld of MIT, and JP Vasseur, Cisco Fellow and Chief Architect Internet of Things at Cisco Systems.

This article appears to have been published to coincide with a one-day IoT Festival held on Saturday, February 22, 2014, on the MIT campus in Cambridge, Massachusetts.

The enthusiasm of the authors is to be applauded. The promise of increasing the scope of what I would call "rapid device-to-device data communication" (which presently depends entirely on one data communications transport, Ethernet, with a set of markup languages running at the application layer) is certainly an important objective, and achieving it would expand the usefulness of devices, along with the range of what people can do with them. Even so, I think a lot of caution should be exercised about the entire notion.

Tellingly, it isn't until approximately five paragraphs from the end of "As Objects Go Online" that the authors address the question of whether it makes sense, from the perspective of data security, to open the Smart Grid to data communications over the Internet of Things they champion. In light of the recent exposure of the Heartbleed security hole in the OpenSSL library, the following claim by Gershenfeld and Vasseur should, in my opinion, be very carefully weighed by anyone seriously considering the "open" Smart Grid notion: "The history of the Internet has shown that security through obscurity doesn't work. Systems that have kept their inner workings a secret in the name of security have consistently proved more vulnerable than those that have allowed themselves to be examined — and challenged — by outsiders. The open protocols and programs used to protect Internet communications are the result of ongoing development and testing by a large expert community." (quoted from Gershenfeld and Vasseur's article as published on the Foreign Affairs web site).

In the next paragraph they present their argument about the real cause of many of the Internet's serious security problems: human error. I certainly agree with this claim, which points to the predominant role played by human error, poor procedural planning, and a lack of effective risk management across the history of successful, malicious attacks conducted over the web. But this is by no means to excuse what can only be called the shoddy software development at the foundation of the Heartbleed problem. Procedures and controls are useless, even when correctly implemented, if the open source software the authors laud is itself full of holes and bugs.

As I wrote recently in this blog, in my opinion we need much better methods at the transport and application layers of the data communications protocol stack to ensure, at a minimum, the suitability and security of software before we condone using it for something as mission-critical as the Smart Grid.

Ira Michael Blonder

© IMB Enterprises, Inc. & Ira Michael Blonder, 2014 All Rights Reserved


Building a Data Security Model for the Internet of Things

Two executives from Cisco jointly presented a keynote at this year's RSA Conference in San Francisco, titled The New Model of Security. Christopher Young, Senior Vice President, Security Business Group, and Padmasree Warrior, Chief Technology and Strategy Officer, spoke for 26 minutes on the topic of the Internet of Things and its impact on data security best practices.

Online security is, and for the foreseeable future will remain, one of the most important components of any mature ISV's product platform. Cisco is no exception. But this presentation at the RSA Conference did not provide me with a lot of new information about how Cisco is meeting the challenge.

Cisco has, on a few occasions, created brands for purported industry trends which somehow never got off the ground; examples include the Home Technology Integration (HTI) effort, which didn't deliver on its promise. Is the Internet of Things just another one of them?

Regardless of how one answers the question, the important point about the notion of an Internet of Things, for this keynote, is simply the explosive, geometric proliferation of connected devices over the last thirty years. Warrior presented statistics including a universe of approximately 1,000 devices in 1980, which, she claims, is today approaching (or even exceeding) 10 billion.

Christopher Young depicted the problem all these devices represent for ISVs with security solutions: when the connected device is a highly complex machine like an automobile, anyone analyzing the points where the device is vulnerable to malicious attack needs to think about subsystems, component manufacturers, and more. In other words, the real conundrum is ensuring that all of the OEMs contributing to the production of the final, complex connected device share the same security priorities, architectures, and so on.

Young did not offer any examples of how anyone is successfully coordinating OEMs to provision a truly effective security solution for connecting complex devices like automobiles to the Internet, but, one can argue, at least Cisco is aware of the challenge, which is an important starting point.

There is ample precedent for such a policy, of course, in the production of the functional architecture of automobiles and, on an even bigger scale, airplanes. Boeing, Airbus, and their peers are quite effective at managing subsystems, and the OEMs responsible for them, to ensure conformance with functional standards. Why not do the same for Internet connectivity?

Warrior also noted a need for device-to-device authentication, which I think makes a lot of sense. Ethernet, unfortunately, does not support the data communications handshaking required to provide this level of authentication, but Warrior's comment may actually signal efforts on Cisco's part to build new data communications protocols, on top of or beneath Ethernet over TCP/IP, capable of simulating the type of error checking and authentication required to really control data communications between connected devices.
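As a rough illustration of what device-to-device authentication can mean in practice, here is a minimal challenge-response sketch over a pre-shared key. The key, the function names, and the framing are this writer's assumptions for illustration only; a production design would rely on a vetted protocol (mutually authenticated TLS, for example) rather than anything hand-rolled like this:

```python
# Minimal sketch of HMAC challenge-response authentication between two
# devices sharing a pre-provisioned key. Illustrative assumptions only.
import hashlib
import hmac
import secrets

SHARED_KEY = b"pre-provisioned-device-key"  # hypothetical example key

def respond(challenge: bytes, key: bytes = SHARED_KEY) -> bytes:
    """A device proves possession of the key by MACing the peer's challenge."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def verify(challenge: bytes, response: bytes, key: bytes = SHARED_KEY) -> bool:
    # compare_digest avoids leaking timing information during comparison.
    return hmac.compare_digest(respond(challenge, key), response)

# Device A challenges device B; B answers; A verifies. Run the same
# exchange in the other direction and the authentication is mutual.
challenge = secrets.token_bytes(16)
assert verify(challenge, respond(challenge))
assert not verify(challenge, respond(challenge, b"wrong-key"))
```

Each fresh random challenge defeats simple replay of an old response, which is precisely the kind of handshaking raw Ethernet framing does not provide on its own.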

Ira Michael Blonder

© IMB Enterprises, Inc. & Ira Michael Blonder, 2014 All Rights Reserved


ProofPoint Uncovers Successful Malicious Email Activity and Finds Security Holes in the Internet of Things

On January 16, 2014 ProofPoint published a press release, titled ProofPoint Uncovers Internet of Things (IoT) Cyberattack. According to the company, some 750K “Phishing and SPAM emails” were uncovered through ProofPoint’s efforts. The sources of these attacks were traced back to a set of home entertainment centers, televisions and “at least one refrigerator”.

This information should help people interested in the notion of IoT to better understand the range of devices included in the scope of the first significant hack attempt on this type of data communications. Conspicuously absent from the list of compromised devices included in the release are smart thermostats, electric meters, HVAC systems, and home security systems. But it is increasingly likely attackers will soon begin to penetrate HVAC systems and their kin. Certainly the risk of a successful attempt to compromise an HVAC system is an order of magnitude greater than the risk of a rogue smart refrigerator sending spam emails.

The ProofPoint release also helps us better understand why hackers are targeting IoT devices. The malicious exploits amounted to efforts to turn smart appliances into broadcast resources for junk email and phishing attempts. The objective is clearly nefarious, as ProofPoint's release points out: "Cyber criminals intent on stealing individual identities and infiltrating enterprise IT systems have found a target-rich environment in these poorly protected internet connected devices that may be more attractive and easier to infect and control than PC, laptops, or tablets." (quoted from ProofPoint's press release; a link to the complete release appears earlier in this post).

Consumers of these smart appliances and home convenience devices may want to read ProofPoint's release before connecting one of them to the Internet. Investors keen on the IoT trend may also want to read it, if for no other reason than to get a sense of the magnitude of a negative black swan event, and the destructive damage it could do to businesses marketing IoT solutions.

Ira Michael Blonder

© IMB Enterprises, Inc. & Ira Michael Blonder, 2013 All Rights Reserved


What is Google’s Acquisition of Nest All About?

Back on April 1, 2013 I wrote a post to this blog about the Nest home thermostat, The Learning Feature of the Nest Thermostat is Interesting, But the Zigbee Internals and Compatibility with Smart Meters is More Important. On May 7, 2013, I interviewed Kate Brinks of Nest Labs and wrote a follow on post, which I published on June 4, 2013, Nest Labs Acquisition of MyEnergy Makes Sense.

My interest in Nest Labs grew out of my first-hand experience in the Smart Home effort of 2003–2004. I was also directly involved with some entrepreneurs working on Radio Frequency Identification (RFID) and Industrial Ethernet hardware, and had frequent conversations with several of the early pioneers of industrial process control solutions, building a couple of business relationships which developed over the next few years (please contact me for the specific names).

But the assumption I included in my first post on this device, the one about bi-directional data communications capability with the Smart Grid, was not correct. Nest Labs bought the technology when they acquired MyEnergy, as I wrote in the second post on the topic.

The point is, from what I found through my conversation with Kate Brinks, and from my own research, Nest Labs had neither especially deep experience with, nor an especially deep technical understanding of, the industrial side of automated process control systems, at least back in the spring of last year. It is very likely they have invested in hiring this expertise since then, so I have little doubt of their technical capability, now, to do some original, pioneering work in this area.

I think Google acquired the company to bring in house the superb technical product management Nest Labs exhibited with the original debut of its home thermostat. Disclaimer: I have had no conversations with anyone, at Nest Labs or at Google, to support this conjecture, but I can't help thinking this acquisition means Google had a burning need to acquire the best product management expertise with consumer hardware devices it could find. That makes sense when we consider the Chromecast product line, Google Play, the Moto X smart phone effort, and so on.

But I think all the talk about the "Internet of Things", and what this acquisition will do for the effort, is pure conjecture, and off target. As I wrote elsewhere in this blog, Cisco tried drumming up all this enthusiasm ten years ago with little lasting success. I don't see the latest iteration going any further. The real key to the Smart Home is to be found elsewhere, somewhere closer to the industrial automation and process control technologies required to make it work.

Ira Michael Blonder

© IMB Enterprises, Inc. & Ira Michael Blonder, 2013 All Rights Reserved


The Internet of Things Does Not Need a New Common Language

Over the last several months Cisco, Salesforce.com, General Electric and other businesses have published a lot of promotional information about a new concept: the "Internet of Things". I've written earlier on this topic to voice two opinions:

  1. the notion is nothing new. Process control and industrial automation have existed as similar efforts for over 40 years. Numerous reliable methods exist, today, to enable washing machines, home thermostats, air conditioners, security systems, etc. to communicate, bi-directionally, over Ethernet. Most of these methods support markup languages, like HTML, and scripting languages, like JavaScript, at the application layer
  2. the leaders of the initiative aren't likely to succeed. Cisco also played a significant role in the "smart home" movement, with little lasting success. Salesforce.com's role looks like a diversionary tactic to obscure the real pressing issue for its business, namely attrition in subscription rates.

It's time for me to add a third opinion: an Internet of Things does not need a new common language. On January 7, 2014, Nick Bilton of the New York Times published an article titled Wolfram Wants to Connect the Internet of Things. But Modbus, Fieldbus, Profibus, and DeviceNet have each existed for years, are completely suitable for Ethernet data communications, and are chock full of the "thing"-specific features and components any language purporting to support an Internet of Things would need to be helpful and effective.

So all this talk, in my opinion, is yet more evidence of why the champions of this initiative are, once again, going about their work in completely the wrong way. In parallel fashion to the “smart home” initiatives of 2001-2005, they are completely disregarding a working, reliable platform fully capable of handling the “things” they claim need to be connected.

Most consumers are ignorant of industrial automation and process control, which is a credit neither to our educational system nor to the analysts covering efforts to enable "dumb" devices with logic. Perhaps some of the publicity about this initiative should be redirected to credit the internet of things that already exists, and works very well, thank you.

Ira Michael Blonder

© IMB Enterprises, Inc. & Ira Michael Blonder, 2013 All Rights Reserved