Anton Prenneis, Technology Evangelist, EMC OEM Solutions
On May 30, I attended the annual TiECON East (#TiECONEast) conference in Cambridge, MA. The conference was organized by TiE (The Indus Entrepreneurs), which describes itself as the largest global not-for-profit organization promoting entrepreneurship. This year’s conference, themed “Breaking Boundaries”, featured technical panel discussions on timely topics ranging from the Internet of Things to robotics, cloud security, and connected healthcare.
One panel, entitled “Data is the New Oil”, dealt with the question of how to “refine” and monetize Big Data. Like oil, data is a hugely valuable resource. But unlike oil, the ability to extract value from data is not well understood, and a vast amount of data remains “untapped” (or, to put that point another way, at the risk of overextending the analogy, we will never reach “peak data”). Questions were posed to five panelists, including one from Pivotal, EMC’s Big Data spinout. Here are the key points I came away with:
1) How can Big Data be sold to business unit executives?
The “prep for the future” argument doesn’t work: Business executives have many priorities competing for budget and attention, and most of those priorities are near-term. Even though they may “get” the message that their competitors might gain an advantage using Big Data, preparing for the future is always something that can (and usually does) wait for tomorrow.
Listen for the pain points: Many execs will tell you their Big Data challenges without necessarily realizing that’s what they are. A mobile executive talking about subscriber churn, a utility exec talking about electricity theft, a logistics exec talking about fleet fuel costs: these execs might not view these as problems that Big Data can solve, but as it turns out, analytics is being used today to address all of these issues. It’s our job to listen and to think about how Big Data can be applied.
In some industries, regulatory compliance can be a key driver: Understand how analytics can be used to help CFOs meet the regulatory requirements in their industries (e.g., HIPAA, Sarbanes-Oxley, Basel, etc.).
In most industries, incremental growth is the main driver: Whether it’s saving costs through efficiency or penetrating untapped markets with new products and services, the most exciting opportunities are the ones that promise to introduce whole new sources of top- and bottom-line growth.
2) Do companies have Big Data, or just big problems with the data they have?
Many companies are grappling with “analysis paralysis”. A familiar example is a utility company trying to figure out what to do with the avalanche of new data being generated by newly deployed smart meters. Another example, cited in the panel discussion, is a mobile operator trying to identify problem apps on mobile devices in order to decrease support center calls.
It is our job to help these companies discover the use cases. Look at the short-, medium-, and long-term value that a company can extract from its data. To use the utility example, a near-term use case might be to simply understand daily and seasonal electricity usage patterns among its subscriber base. A medium-term use case might be to use that new understanding to create energy efficiency incentive programs for subscribers. And a long-term use case might be to work with city planners to design automation into buildings, enabling them to adapt usage behavior to minimize the CO2 output of a regional grid.
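To make the utility example concrete, the near-term use case (understanding daily and seasonal electricity usage patterns) can be sketched in a few lines. This is a hypothetical illustration, not a production pipeline; the `usage_profile` function and its (timestamp, kWh) input format are my own invention:

```python
from collections import defaultdict
from datetime import datetime

def usage_profile(readings):
    """Average kWh per (season, hour-of-day) from smart-meter readings.

    `readings` is an iterable of (timestamp, kwh) pairs, standing in
    for a real meter-data feed.
    """
    seasons = {12: "winter", 1: "winter", 2: "winter",
               3: "spring", 4: "spring", 5: "spring",
               6: "summer", 7: "summer", 8: "summer",
               9: "fall", 10: "fall", 11: "fall"}
    totals = defaultdict(float)
    counts = defaultdict(int)
    for ts, kwh in readings:
        key = (seasons[ts.month], ts.hour)
        totals[key] += kwh
        counts[key] += 1
    return {k: totals[k] / counts[k] for k in totals}

# Two July evenings with heavy A/C load, one winter morning
readings = [
    (datetime(2014, 7, 1, 18), 3.0),
    (datetime(2014, 7, 2, 18), 3.5),
    (datetime(2014, 1, 5, 7), 1.1),
]
profile = usage_profile(readings)
print(profile[("summer", 18)])  # average of 3.0 and 3.5 -> 3.25
```

A profile like this is exactly the baseline the medium-term incentive programs would build on: once you know when and where peak load occurs, you know whom to incentivize.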
3) What are some Big Data do’s and don’ts for IT?
Don’t: Try to be a futurist. Do: Focus on agility:
Trying to read the minds of business stakeholders isn’t going to deliver the Big Data solutions they need. Instead, use the tools available to “fail fast” and iterate. Faced with such a huge volume, variety and velocity of data, trying out many combinations quickly to see what works is the only way to deliver on the fourth V of Big Data: value.
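As a toy illustration of the “fail fast” approach: cheaply score many small feature combinations and keep the best one, rather than betting everything on a single upfront guess. Everything here is hypothetical, including the `quick_score` proxy metric (absolute correlation) and the data layout:

```python
from itertools import combinations

def quick_score(features, target):
    """Cheap proxy metric: absolute correlation of the summed features
    with the target. A stand-in for a fast model-fitting step."""
    xs = [sum(row) for row in zip(*features)]   # combine features per sample
    n = len(xs)
    mx, my = sum(xs) / n, sum(target) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, target))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in target) ** 0.5
    return abs(cov / (vx * vy)) if vx and vy else 0.0

def iterate_fast(columns, target, max_size=2):
    """Try every small feature combination; keep whatever scores best."""
    best, best_score = None, -1.0
    for k in range(1, max_size + 1):
        for combo in combinations(columns, k):
            s = quick_score([columns[c] for c in combo], target)
            if s > best_score:
                best, best_score = combo, s
    return best, best_score

columns = {"a": [1, 2, 3, 4], "b": [1, 1, 2, 3], "noise": [1, 1, 2, 0]}
target = [2, 4, 6, 8]
best, score = iterate_fast(columns, target)
print(best)  # ('a',) -- perfectly correlated with the target
```

The point is not the metric; it is the loop. Each iteration is cheap enough to discard without regret, which is what makes failing fast affordable.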
Don’t: Be a “middleman” between your users and their data. Do: Maximize speed:
A common time-consuming task in traditional BI environments is the movement of data from one data repository (usually a production repository) to another so that it can be analyzed. Tools and platforms now exist to obviate this need. With in-memory data fabrics that enable real-time data analysis, and storage platforms that allow a single “data lake” to be accessed using different protocols, including HDFS, it is increasingly possible to analyze data in place and return results much more quickly.
Don’t: Think all data is equal, or that security is all about authentication and encryption. Do: Be aware of the unique traits of your data, and your users.
A good example here is geofencing of data. For a variety of reasons, ranging from privacy to regulatory to performance, you might wish to store data in a particular region, country or datacenter. Another example is understanding the access patterns of your users. An increasing number of security breaches today are “inside jobs” (think of Edward Snowden). Security is no longer just about ensuring that untrusted people cannot access your systems. Security is also about ensuring that trusted people don’t access data they shouldn’t. Can your systems spot anomalies such as a user accessing a database multiple times in one day that she or he has never accessed before? If not, your data is at risk.
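The kind of anomaly just described can be sketched very simply: track which databases each user has touched historically, and flag a first-ever access once a baseline exists. This is a deliberately minimal illustration (a real system would add time windows, rate thresholds, and peer-group baselines); the `AccessMonitor` name is my own:

```python
from collections import defaultdict

class AccessMonitor:
    """Flag users touching databases outside their historical pattern."""

    def __init__(self):
        self.history = defaultdict(set)   # user -> databases seen before

    def record(self, user, database):
        """Return True if this access looks anomalous: a first-ever
        touch of this database by this user, after a baseline exists."""
        seen = self.history[user]
        anomalous = bool(seen) and database not in seen
        seen.add(database)
        return anomalous

monitor = AccessMonitor()
monitor.record("alice", "billing")     # baseline building: not flagged
monitor.record("alice", "billing")     # familiar access: not flagged
print(monitor.record("alice", "hr"))   # new database for alice: True
```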
4) If Big Data is the “new oil”, then automation is the “new plastic”.
Automation is where the true power of analytics, particularly real-time analytics, comes into play. It is commonly agreed that “data scientist” is one of the hottest new fields. The belief is that someone with a unique set of skills, combining statistics with depth of expertise in a particular business domain, is needed to separate the wheat from the chaff in business data and generate real insight. While this is true, it is also true in many domains that nobody has the time to do analytics on all the data coming in. There’s just too much of it. And furthermore, some decisions need to be made instantaneously (you wouldn’t, for instance, want a data scientist making braking decisions for your self-driving car!).
Big Data problems are becoming “Fast Data” problems. And these problems are being addressed by technologies such as GemFire from Pivotal, which provides the tools necessary to build decision models in memory so that actions can be taken instantaneously. But it is very early days for this kind of technology, and there was agreement on the panel that automated decision making represents the next wave of Big Data innovation.
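For flavor, here is a minimal in-memory rule engine in the spirit of that idea (this is my own sketch, not the GemFire API): the decision logic lives in memory alongside the event stream, so each event is scored without a round trip to disk, or to a human analyst.

```python
class DecisionEngine:
    """Toy in-memory rule engine: rules are (predicate, action) pairs
    evaluated against each incoming event as it arrives."""

    def __init__(self):
        self.rules = []

    def when(self, predicate, action):
        self.rules.append((predicate, action))

    def process(self, event):
        """Evaluate every rule against the event; return actions fired."""
        return [action(event) for predicate, action in self.rules
                if predicate(event)]

engine = DecisionEngine()
# Hypothetical braking rule: fire if time-to-obstacle is under 2 seconds
engine.when(lambda e: e["speed"] > 0 and e["obstacle_m"] / e["speed"] < 2.0,
            lambda e: "BRAKE")
print(engine.process({"speed": 20.0, "obstacle_m": 30.0}))  # ['BRAKE']
```

The braking rule echoes the self-driving-car example above: nobody queries a data warehouse, the model is already resident and the decision is immediate.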
A lot of good stuff here to think about for entrepreneurs and big tech companies alike. And judging by the energy in the standing-room-only room at this session, I’d predict that we’re going to be seeing an enormous amount of innovation in this space! (P.S. My apologies for the “new plastic” analogy. That was mine…)
EMC is a dedicated global organization that understands design-driven businesses and your unique needs. We deliver comprehensive programs and services that allow you to differentiate your products and services from your peers. Watch the video below to learn more.
EMC OEM Solutions
On location at EMC World 2014, Anton Prenneis gives an overview of EMC’s OEM business. Topics include the expansion of EMC OEM Solutions across the EMC Federated Business portfolio to reach new lines of business, helping partners to build vertical-specific solutions, and how new strategies will help EMC’s OEM partners generate incremental revenue.
In just over a month, more than 70,000 people from around the world will descend upon Barcelona, Spain for the annual Mobile World Congress conference. MWC is, without a doubt, the top global conference focused on the mobile telecom business. The EMC-Pivotal-VMware Federation will have a prominent presence at MWC, showcasing the new Real-Time Intelligence for Telcos (RTI-T) offering from Pivotal, and communicating how EMC and VMware are the ideal partners for Telcos making the journey to a more software-defined world (i.e., for all Telcos). Joe Tucci will be delivering a keynote, and Paul Maritz, Pat Gelsinger, John Roese, Jeff Nick, Paul Davey, and other key executives will be on hand to meet with customers, partners and prospects.
It is timely, then, that David Goulden’s vision of trends in the Telco industry should be published in the most recent edition of Connect-World, a leading magazine for the Telco and IT industries. In the article, David discusses how disruptions (and opportunities) tend to affect the Telco and IT industries in parallel, and how the two major trends of Network Functions Virtualization and Big Data will transform the industry – creating new winners and losers in the process.
As the abstract reads: “Today the telco industry is at the vortex of change due to developments, such as network functions virtualization and big data analytics. By allying with IT to embrace and transcend the disruptions characterized by these developments, telecom providers stand to benefit from reduced costs and new revenue streams, and see their profits grow.”
Recently made available for sale, version 1.20 of EMC ScaleIO is a scale-out, software-only, server-based SAN solution that converges storage and compute and provides linearly scalable performance.
ScaleIO was recently acquired by EMC and is now part of EMC’s software-defined storage portfolio. ScaleIO’s product and service offerings differ from those of ViPR, the previously announced EMC software-defined storage solution. While ViPR software allows the end user to abstract storage and all its unique capabilities from physical arrays into a single pool of virtual storage, ScaleIO software allows the end user to combine storage and compute into a single entity. So what is ScaleIO, and why do some consider it a foundation of software-defined storage? To understand it, let’s look at the fundamentals of how a storage appliance works.
Part 3 in a series of articles on EMC’s recently announced OEM-Ready VNX family of Unified Storage
Following up on my previous post on the new game-changing VNX family of Unified Storage, I wanted to elaborate on the use of SSDs and the FAST (Fully Automated Storage Tiering) Suite, which allow OEMs to balance performance while lowering total cost of ownership (TCO).
While the previous VNX architecture was designed to add SSDs to a traditional mechanical HDD array, the new VNX architecture is designed with SSD in mind first, while still allowing traditional mechanical HDDs to be added to SSD arrays. The end result is that customers have the flexibility to optimize TCO with tiered service levels, while getting great $/IOPS that scales and performance improvements of up to 3x.
Part 2 in a series of articles on EMC’s recently announced OEM-Ready VNX family of Unified Storage
I wanted to follow up on my recent blog on the new game-changing VNX family of Unified Storage and elaborate on the performance, or “firepower,” that is of interest to OEMs.
As Moore’s Law famously suggests, CPU processing power roughly doubles every 18 months. With increasing CPU power, more and more virtual machines are now running on any given server. This, in turn, requires a storage array to deliver more firepower and performance than ever before. As an example, cloud computing or a unified communications system running in a virtual environment will now require storage that can handle more transactional traffic (IOPS) at lower latency than ever before.
To unleash greater system performance, the new VNX family uses a clean-sheet design that combines new multi-core technology, branded MCx, with a storage array designed with Flash-based SSDs in mind. Unlike the previous architecture, the new MCx architecture allows software to truly take advantage of multiple cores. In addition, while the previous architecture was designed to add SSDs to a traditional mechanical HDD array, the new design allows traditional mechanical HDDs to be added to SSD arrays. The end result is truly spectacular.
Building on the tremendous success of the award-winning VNX, today EMC announced the immediate availability of the next generation VNX Series of unified storage. The new VNX solution is truly a game changer and revolutionizes midrange storage, delivering unprecedented levels of application performance, capacity efficiency, data protection, and ease of use.
The VNX Family serves environments of all sizes with a single platform supporting block, file, and object storage and is available as separate series for smaller and larger customers—the VNXe series and the VNX series. To learn more please visit www.EMC.com/VNX
Key benefits of the new VNX family are as follows:
A single platform supporting block, file, and object storage that delivers the transactional performance of a traditional midrange array at one-third the price
Drives up to 4X more virtual machines, more transactions, more file operations, and more virtual desktops—all at an affordable price
Improves file performance for transactional applications by up to 3X with 60 percent better response time
Reduces storage capacity requirements by up to 50 percent with FAST Suite improvements and block deduplication
Moves beyond high availability to continuous operations within and across data centers with active-active and VPLEX
Improves management productivity with self-service IT, increased visibility, and tight integration with VMware vSphere and Microsoft Hyper-V
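As an aside on the capacity savings above, the idea behind block deduplication can be sketched generically: split data into fixed-size blocks, hash each block, and store each unique block only once, keeping a recipe of hashes for reassembly. This is a generic illustration of the technique, not the VNX implementation:

```python
import hashlib

def dedupe(data, block_size=4096):
    """Fixed-size block deduplication: store each unique block once,
    plus an ordered list of block hashes for reassembly."""
    store, recipe = {}, []
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)   # keep unique blocks only
        recipe.append(digest)
    return store, recipe

def rehydrate(store, recipe):
    """Rebuild the original data from the stored blocks."""
    return b"".join(store[d] for d in recipe)

data = b"A" * 8192 + b"B" * 4096 + b"A" * 4096   # 16 KB, heavily repeated
store, recipe = dedupe(data)
print(len(data), sum(len(b) for b in store.values()))  # 16384 8192
assert rehydrate(store, recipe) == data
```

On this (deliberately repetitive) sample, two unique 4 KB blocks stand in for four, halving the stored bytes; real-world savings depend entirely on how repetitive the workload’s data is.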
In a highly globalized world where technology and regional requirements change very fast, I often get questions from OEM customers and field teams alike about assurances of extended product life, spares availability, and service life. This is one of the most critical decisions for many OEMs, especially those playing in emerging markets who are looking to modernize their infrastructure, but at the same time want to amortize their investment over an extended period and need extended service life. Extended product and service life also helps an OEM play in the developed world. No surprise here: it is often the first question asked before consideration even begins.
First in a series of posts focusing on the Carbon War Room report: “Machine-to-Machine Technologies: Unlocking the Potential of a $1 Trillion Industry”
1) The revolution in human communications is spurring a second revolution: the ability to transmit, analyze and act on machine-generated data.
2) Dramatic size and growth projections of the M2M market over the next decade suggest a huge economic opportunity for high-tech vendors.
3) At the same time, the efficiencies offered by M2M technologies can lower greenhouse gas emissions without adversely impacting economic growth.
4) EMC’s federated businesses (EMC, RSA, Pivotal, VMware) and our OEM customers stand to benefit from a joint, comprehensive M2M strategy.
Do you believe in magic?
Stop for a moment to consider where we are in history today. On March 7, 1876, Alexander Graham Bell was issued U.S. patent number 174,465 for discovering that a voice could be transmitted across a wire immersed in a conducting liquid. And now, 137 years later (a mere blip in time), humans are able to connect to each other from anywhere on earth, over wires or wirelessly – nearly instantaneously. This is nothing short of revolutionary.
The opinions and interests expressed on Dell EMC employee blogs are the employees' own and do not necessarily represent Dell EMC's positions, strategies or views. Dell EMC makes no representation or warranties about employee blogs or the accuracy or reliability of such blogs. When you access employee blogs, even though they may contain the Dell EMC logo and content regarding Dell EMC products and services, employee blogs are independent of Dell EMC and Dell EMC does not control their content or operation. In addition, a link to a blog does not mean that EMC endorses that blog or has responsibility for its content or use.