Enterprise Strategy Group | Getting to the bigger truth.™

Posts Tagged ‘Oracle’

Oracle Server Virtualization: The Quiet Killer Technology

Tuesday, November 9th, 2010

While Wall Street and the IT industry remain ga-ga over VMware, the company has some serious challenges ahead. According to ESG Research, large organizations are embracing server virtualization technology for workload consolidation, but continue to struggle with more sophisticated server virtualization implementation issues like:

  1. Performance management. Server virtualization creates numerous integrated “moving parts” when applications running on dedicated servers suddenly move to shared hardware.
  2. Complex multi-tiered application deployment. Server virtualization is fine for discrete workloads controlled by IT but application owners are much more cautious. This is especially true when it comes to applications that depend upon multiple horizontal and vertical tiers. Furthermore, middleware is particularly dicey since it anchors all application-to-application communications.

Given these complexities, many firms simply eschew server virtualization for complex mission-critical applications and point VMware and Xen at more basic workloads. This helps cut capital costs and optimizes hardware but it doesn’t really change IT fundamentals.

It is with these very complex application workloads that Oracle has a distinct advantage. Rather than starting with server virtualization and looking up the technology stack, Oracle starts at the business application and looks down. In this way, Oracle can align its portfolio of business applications, development tools, and middleware with tight integration for server virtualization. Oracle is already doing this with the WebLogic Server Virtualization Option and Oracle Virtual Assembly Builder. Oracle also tightly integrates these suites on its virtualization technology without the need for an operating system. Finally, Oracle is plugging its application and infrastructure management tools into server virtualization as well.

Server virtualization and cloud nirvana comes down to simple and automated provisioning and configuration of an application “stack” that includes business apps, middleware, databases, operating systems, networks, and storage. Once deployed, real-time management kicks in to ensure availability, security, and high performance. Oracle hasn’t got all of these pieces but it appears to me that it has more of them than others.
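To make "simple and automated provisioning" of a full stack concrete, here is a minimal Python sketch. The tier names and dependency map are hypothetical illustrations, not any vendor's actual format or API; the point is simply that a declarative stack definition can be walked in dependency order by a provisioning engine:

```python
# Hypothetical declarative stack descriptor: each tier lists the tiers
# it depends on. Not a real Oracle or VMware format.
STACK = {
    "storage":    [],
    "network":    [],
    "os":         ["storage", "network"],
    "database":   ["os"],
    "middleware": ["os"],
    "app":        ["database", "middleware"],
}

def provision_order(stack):
    """Topologically sort tiers so each is provisioned after its dependencies."""
    done, order = set(), []
    def visit(tier):
        if tier in done:
            return
        for dep in stack[tier]:
            visit(dep)
        done.add(tier)
        order.append(tier)
    for tier in stack:
        visit(tier)
    return order

provision_order(STACK)
# -> ['storage', 'network', 'os', 'database', 'middleware', 'app']
```

The automation the post describes amounts to executing a walk like this against real infrastructure, then handing off to real-time management once the stack is live.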

In my view, Oracle has one other distinct advantage. Server virtualization is a deployment option as far as Oracle is concerned so Oracle’s server virtualization market share won’t make or break the company or its multitude of business units. VMware on the other hand is anchored to virtualization so it must evolve its technology from its hypervisor roots into an enterprise and cloud computing platform in order to drive further growth and scale. VMware may be up to this task but Oracle has a much more straightforward server virtualization path ahead.

The CIA and the Encrypted Enterprise

Friday, October 29th, 2010

The international horse show wasn’t the only event in Washington DC this week; I participated in the Virtualization, Cloud, and Green Computing event in our nation’s capital. One of the guest speakers was Ira “Gus” Hunt, CTO at the CIA. If you haven’t seen Gus speak, you are missing something. He is very strong on the technical side and extremely energetic and entertaining.

Gus focused on cloud computing activities at the CIA (I’ll blog about this soon), but I was intrigued by one of his slide bullets that referred to something he called the “encrypted enterprise.” From the CIA’s perspective, all data is sensitive whether it resides on an enterprise disk system, lives in a database column, crosses an Ethernet switch, or gets backed up on a USB drive. Because of this, Hunt wants to create an “encrypted enterprise” where data is encrypted at all layers of the technology stack.

The CIA is ahead here, but ESG hears a similar goal from lots of other highly regulated firms. When will this happen? Unfortunately, it may take a few years to weave this together as there are several hurdles to overcome including:

  1. An encryption architecture. Before organizations encrypt all their data, they have to understand where the data needs to be decrypted. For example, remote office data could be encrypted when it is sent to the corporate data center, but it needs to be decrypted before it can be processed for large batch jobs like daily sales and inventory updates. There is a balancing act between data security and business processes here demanding a distributed, intelligent encryption architecture that maps encryption/decryption with business and IT workflow.
  2. Key management. Most encryption products come with their own integrated key management system. Many of these aren’t very sophisticated and an enterprise with hundreds of key management systems can’t scale. What’s needed is a distributed secure key management service across the network. Think of something that looks and behaves like DNS with security built in from the start. The Key Management Interoperability Protocol (KMIP) effort may get us there in the future as it is supported by a who’s who of technology vendors including EMC/RSA, HP, IBM, and Symantec, but it is just getting started.
  3. Technical experience. How should I encrypt my sensitive Oracle database? I could use Oracle tools to encrypt database columns. I could encrypt an entire file system using Windows EFS or tools from vendors like PGP. I could buy an encrypting disk array from IBM, or I could combine EMC PowerPath software with Emulex encrypting Host-based Adapters (HBAs). Which is best? It depends on performance needs, hardware resources, and financial concerns like asset amortization. Since there is no “one-size-fits-all” solution here, the entire enterprise market is learning on the fly.
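To make the key management point concrete, here is a toy Python sketch of the kind of centralized, DNS-like key lookup service described above: clients resolve a key by name and version from one registry instead of every encryption product keeping its own keystore. This is an illustration of the lookup and rotation model only, not KMIP and not production crypto (class and method names are invented):

```python
import secrets

class KeyService:
    """Toy centralized key registry. A real service (e.g., a KMIP server)
    would add authentication, auditing, and HSM-backed storage."""

    def __init__(self):
        self._keys = {}  # key name -> list of key versions (newest last)

    def create(self, name):
        """Create (or rotate) a 256-bit key; return its version number."""
        self._keys.setdefault(name, []).append(secrets.token_bytes(32))
        return len(self._keys[name])

    def resolve(self, name, version=None):
        """Look up a key by name; default to the current version."""
        versions = self._keys[name]
        return versions[(version or len(versions)) - 1]

svc = KeyService()
v1 = svc.create("db/orders/card-column")
svc.create("db/orders/card-column")  # rotation: old ciphertext stays decryptable
old_key = svc.resolve("db/orders/card-column", v1)
new_key = svc.resolve("db/orders/card-column")
```

Keeping old versions resolvable is what lets rotated keys coexist with data encrypted under earlier versions, which is exactly why an ad hoc keystore per product cannot scale.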

A lot of the technical limitations are being worked on at this point, so the biggest impediment may be based upon people and not technology. We simply don’t have a lot of experience here, so we need to proceed with research, thought, and caution. To get to Gus Hunt’s vision of the “encrypted enterprise,” we need things like reference architectures, best practices, and maturity models as soon as possible. Look for service providers like CSC, HP, IBM, and SAIC to offer “encrypted enterprise” services within the next 24 months.

IBM To Buy Brocade And Other Stupid M&A Rumors

Thursday, September 23rd, 2010

I was at Oracle Open World yesterday when I heard the rumor that IBM was going to buy Brocade. At the time, I was meeting with a group that had collective industry experience of more than 100 years. We all laughed this off as hearsay.

The fact is that IBM already OEMs equipment from Brocade (as well as Juniper), so it is not lacking in engineering experience or alternatives. Does IBM want to start a stand-alone networking business? Does it want to OEM Fibre Channel switches to competitors like HP? Does it want to bet on Brocade/Foundry Ethernet switches against the rest of the industry? No, no, and no.

This is not the only silly rumor we’ve heard lately. Last week, Microsoft was going to buy Symantec. Yeah sure, there are no antitrust implications there. And does Microsoft really want to buy a company that has about a dozen products that are redundant to its own?

How about Oracle buying HP? Larry may be spinning this up for fun, but it’s simply crazy talk. Oracle, a software company focused on business applications and industry solutions, wants to get into the PC and printer businesses? Yeah, I know, “What about servers and storage?” To which I answer, “What about Sun?”

These rumors are circulating because of the recent uptick in M&A activity, but my strong bet is that nothing remotely similar will happen. The rumors must then be coming from one of two sources:

  1. Wall Streeters executing a “pump and dump” play. Given the activity in Brocade’s stock yesterday, this is likely. I hope the SEC is all over this unethical practice.
  2. Bloggers and Tweeters trying to “stir the pot.” Maybe the Internet has become the great equalizer between intelligent discourse and ignorance.

Not all mergers make sense, but there tends to be some business logic inherent in most transactions. Let’s try and remember that before spreading rumors for personal or unethical gain.

First Impressions from Oracle Open World

Wednesday, September 22nd, 2010

I’m here in San Francisco for Oracle Open World. Just arrived, but I already have some first impressions.

  1. There are signs, billboards, and brochures boasting about Oracle’s commitment to integrated hardware and software. This is the ultimate irony to an industry old-timer like me, as Oracle led the open systems charge in the 1990s, lambasting Digital Equipment and IBM for their autocratic systems control. I’ll have to poke around for some old Oracle ads and compare them to its new integrated stack mantra.
  2. VMware is here and if I stop by its booth, I get a free espresso. Funny thing is that, as far as I know, Oracle doesn’t support its apps or databases running on top of VMware. Based upon ESG Research, I believe that within two years or so, large organizations will deploy Oracle virtualization infrastructure to run Oracle workloads alongside VMware, Xen, or Hyper-V workloads. When this happens, I can’t imagine VMware will be very visible at Oracle Open World, let alone spring for coffee.
  3. Rumor has it that Oracle will either announce its own Ethernet switch or buy one of the remaining independents. Personally, I hope Oracle doesn’t go down this road and decides to work with everyone else.
  4. I fully expect Oracle to jump much deeper into the security waters. This is becoming a requirement for being in the systems business.
  5. Michael Dell spoke this morning about virtualization in the data center. Application and database folks used to refer to this stuff as “plumbing.” Why do they care now? Because distributed applications with huge Hadoop backends need to be tuned with virtual servers, networks, and storage IO in mind.
  6. I’m looking for Oracle to be one of the leaders that transform today’s monolithic enterprise-focused identity management technology to a more Web-friendly democratic model.
  7. It’s funny that after Larry ripped apart cloud computing as nothing but industry hype, many vendors are here at OOW preaching, you guessed it, cloud computing.

More soon, time to walk the floor and get indoctrinated.

Networking and Virtualization Vendors Should Join the Open vSwitch Effort

Thursday, September 16th, 2010

My colleague Mark Bowker and I are knee-deep in new research data on server virtualization. Within this mountain of data, we are discovering some existing and impending issues related to network switching.

Today, many server virtualization projects are led by server administrators, with little or no participation from the networking team. As you may imagine, this means that the server team configures all virtual switches to the best of its ability, without considering how physical switches are already configured. As things scale, the server team realizes the error of its ways and quickly calls the networking group in to help out. This is where things really break down. Before doing anything, the networking folks have to learn the virtualization platform, understand how the physical and virtual networks should interoperate, and then roll up their sleeves and start gluing everything together.

This is a painful learning curve but I believe that future issues will be far more difficult. As organizations increase the number of VMs deployed, networking configurations get more difficult — especially when VMs move around. Users regularly complain about the number of VLANs they have to configure, provision, and manage. This situation will grow worse and worse as VMs become the standard unit of IT.
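The VLAN pain is easy to see in a toy model: every VM's VLAN must be trunked to whatever host it currently runs on, so the set of VLANs each physical switch port carries changes every time a VM moves. Here is a minimal Python sketch (hypothetical VM, host, and VLAN data, no real API):

```python
# Hypothetical assignments: which VLAN each VM belongs to, and where it runs now.
vm_vlan = {"web1": 10, "web2": 10, "db1": 20, "erp1": 30}
placement = {"web1": "hostA", "web2": "hostB", "db1": "hostA", "erp1": "hostC"}

def required_trunks(vm_vlan, placement):
    """VLANs each host's physical uplink must carry for its current VMs."""
    trunks = {}
    for vm, host in placement.items():
        trunks.setdefault(host, set()).add(vm_vlan[vm])
    return trunks

before = required_trunks(vm_vlan, placement)   # hostA carries {10, 20}
placement["db1"] = "hostC"                     # one live migration...
after = required_trunks(vm_vlan, placement)    # ...hostA drops 20, hostC adds it
```

A single migration changed the required configuration on two physical switch ports; multiply that by hundreds of VMs moving daily and you get the provisioning churn the networking team inherits.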

In my mind, it makes no sense for virtualization vendors like Citrix, Microsoft, Oracle, and VMware to recreate the richness of physical L2 switches in the virtual world. So what can be done? Well one alternative is to eliminate virtual switches entirely and do all switching at the physical layer via the Virtual Ethernet Port Aggregator (VEPA) standard being developed in the IEEE.

I believe this will happen but in the meantime there is another alternative being discussed this week at the Citrix Industry Analyst Event — Open vSwitch. As described on the project’s web site, “Open vSwitch is a multilayer virtual switch licensed under the open source Apache 2.0 license. The goal is to build a production quality switch for VM environments that supports standard management interfaces (e.g., NetFlow, RSPAN, ERSPAN, CLI), and is open to programmatic extension and control.”
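For a flavor of what those management interfaces look like in practice, here is a hedged sketch using Open vSwitch's standard `ovs-vsctl` tool. Bridge, port, and collector names are made up, and this assumes a host with Open vSwitch installed:

```shell
# Create a virtual bridge and attach a VM's virtual NIC as an access port on VLAN 10
ovs-vsctl add-br br0
ovs-vsctl add-port br0 vnet0 tag=10

# Point NetFlow records at an external collector -- one of the standard
# management interfaces the project description mentions
ovs-vsctl -- set Bridge br0 netflow=@nf \
          -- --id=@nf create NetFlow targets=\"192.0.2.10:5566\"
```

The same configuration database is scriptable, which is the "programmatic extension and control" point: networking teams can drive virtual switches with the same kinds of tools they use on physical ones.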

Here’s why this makes sense to me:

  1. Given a pool of collective resources, a collaborative open effort would provide more advanced switching functionality sooner rather than later.
  2. An open alternative would expose APIs that could be easily integrated with leading switch management tools from Brocade, Cisco, Extreme, Force 10, HP, Juniper, etc.
  3. Vendors would not have to integrate with each hypervisor independently. This would improve code quality and again speed time-to-market.

At the very least, Citrix, Microsoft, and Oracle should back this as a way to push back on VMware’s market share lead.

I’ve been around long enough to know the strengths and limitations of open source and standards but I think that with the right support, this one could have legs. I know that vendors have their own businesses to look after but isn’t another end goal to create products that the market wants? I think Open vSwitch would fit this bill.

HP Buys ArcSight: More Than Just Security Management

Monday, September 13th, 2010

The waiting and guessing games are over; today, HP announced its intent to buy security management software leader ArcSight for $1.5 billion. I didn’t think HP would pull the trigger on another billion+ dollar acquisition before hiring a new CEO, but obviously I was wrong.

ArcSight is a true enterprise software company. As I recall, many of the early ArcSight management team members actually came from HP OpenView. With this model in mind, ArcSight went beyond technology and invested early in top field engineers, security experts, and sales people. This vaulted the company to a leadership position and it never looked back.

For HP, ArcSight fits with its overall focus on IT operations software solutions for Business Technology Optimization. In the future, security information will be one of many inputs that helps CIOs improve IT management and responsiveness. It won’t happen overnight, but think of all sources of IT management data (i.e., log data, SNMP, network flow data, configuration data, etc.) available for query, analysis, and reporting in a common repository. This is what HP has in mind over the long haul.
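A hedged sketch of what that "common repository" implies: events from disparate sources (syslog lines, flow records, and so on) get normalized into one queryable schema. The field names below are illustrative only, not ArcSight's actual event format:

```python
def normalize(source, raw):
    """Map a raw event from a given source type into one common schema (illustrative)."""
    if source == "syslog":
        host, msg = raw.split(" ", 1)
        return {"source": "syslog", "host": host, "detail": msg}
    if source == "netflow":
        return {"source": "netflow", "host": raw["src_ip"],
                "detail": f"{raw['bytes']} bytes to {raw['dst_ip']}"}
    raise ValueError(f"unknown source: {source}")

repo = [
    normalize("syslog", "fw01 denied tcp 10.1.1.5:443"),
    normalize("netflow", {"src_ip": "10.1.1.5", "dst_ip": "192.0.2.9", "bytes": 4096}),
]

# One query now spans both feeds, which is the whole point of the common repository:
hits = [e for e in repo if e["host"] in ("fw01", "10.1.1.5")]
```

Once log, SNMP, flow, and configuration data share a schema like this, the query, analysis, and reporting layer only has to be built once.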

In the meantime, HP should get plenty of ArcSight bang-for-the-buck over the next 12-24 months by:

  1. Aligning ArcSight and EDS. Security is a top activity within professional services firms. Given ArcSight’s enterprise play, EDS will likely double down on IT risk management and push ArcSight wherever it can.
  2. Using ArcSight as a door opener in the federal market. Yes, HP already sells plenty of products and services to Uncle Sam, but it now has access to a CISO community with deep pockets. With CNCI 2.0 and FISMA 2.0 upon us, this will only increase.
  3. Bringing ArcSight into the virtual data center strategy. According to ESG Research, many enterprises don’t do a good job of coordinating security with server virtualization. This is a big problem given virtualization growth — which is why VMware was so vocal about its recent vShield announcement. HP can and should bring ArcSight into its strategic vision for CIOs with massive data center projects.

In spite of its security services and thought leadership, HP’s name has been notably absent from IT security leadership discussions in the past. ArcSight should change that.

A few other quick thoughts:

  1. In the past, ArcSight was built exclusively on top of Oracle databases. Great in terms of enterprise functionality, but it made the product expensive to buy, expensive to operate, and somewhat weak in terms of queries across large data sets. Look for HP to accelerate plans to decouple ArcSight from Oracle ASAP.
  2. If HP is still in buying mode, the obvious question is, “who is next?” Would anyone be surprised if HP made a move for Check Point, F5, or Riverbed soon?

Friday, September 3rd, 2010

Anyone remotely interested in identity management should definitely download a copy of the National Strategy for Trusted Identities in Cyberspace (NSTIC) document.

At a very high level, the strategy calls for the formation of a standards-based, interoperable identity ecosystem to establish trusted relationships between users, organizations, devices, and network services. The proposed identity ecosystem is composed of three layers: an execution layer for conducting transactions, a management layer for identity policy management and enforcement, and a governance layer that establishes and oversees the rules for the entire ecosystem.

There is far more detail than I can cover in this blog, but suffice it to say the document is well thought out and pretty comprehensive in terms of its vision. This is exactly the kind of identity future we need to make cloud computing a reality. Kudos to federal cybersecurity coordinator Howard Schmidt and his staff for kicking this off.

I will post my feedback on the official website, but a few of my suggestions are as follows:

  1. Build on top of existing standards. The feds should rally those working on things like Project Higgins, Shibboleth, Liberty, Web Services, Microsoft Geneva, OpenID, etc. Getting all these folks marching in the same direction early will be critical.
  2. Get the enterprise IAM vendors on board. No one has more to gain — or lose — than identity leaders like CA, IBM, Microsoft, Novell, and Oracle. Their participation will help rally the private sector.
  3. Encourage the development of PKI services. PKI is an enabling technology for an identity ecosystem but most organizations eschew PKI as too complex. The solution may be PKI as a cloud service that provides PKI trust without the on-site complexity. This is why Symantec bought the assets of Verisign. The Feds should push Symantec and others to embed certificates in more places, applications, and devices.

There will be lots of other needs as well. The document recommends identity and trust up and down the technology stack, but it doesn’t talk about the expense or complexity of implementing more global use of IPsec, BGPsec, and DNSSEC. There is also the need for rapid maturity in encryption, key management, and certificate management. Good news for RSA, PGP, nCipher (Thales), IBM, HP, Venafi, and others.

The key to me is building a federated, plug-and-play, distributed identity ecosystem that doesn’t rely on any central authority or massive identity repository. This is an ambitious goal but one that can be achieved — over time — if the Feds get the right players on board and push everyone in the same direction.

WSJ Reports Imminent Sale of ArcSight: Handicapping the Suitors

Thursday, August 26th, 2010

An industry friend just sent me a story from the Wall Street Journal proclaiming that security management leader ArcSight will be acquired within the next week. The story goes on to say that the likely buyers include Oracle, HP, EMC, IBM, and CA.

Hmm. First of all, anyone familiar with ArcSight was sure this was coming. The company is a leader in a growing market segment, has a great Federal business, and is one of few real enterprise players. It is interesting to me that the Wall Street Journal is spreading rumors but that’s another story.

Let me weigh in by handicapping the field:

  1. Oracle. This would be a bold strategic move as Oracle plays in security tools and the identity management space, but not the broader security market. ArcSight is an enterprise software company so it fits with Oracle sales and channels. ArcSight also runs on an Oracle database (for better and for worse). To me, Oracle makes sense as a potential suitor.
  2. HP. HP people always tell me that they want to be in the security services, not the security products business. The company backed this up when it sold its identity management portfolio to Novell. ArcSight fits with OpenView/Opsware as enterprise software so it may have changed its mind, but HP probably wants to be careful with acquisitions in the wake of the Mark Hurd scandal. Heck, HP put in a bid for 3PAR this week and Wall Street went nuts. Given these factors, I’d be surprised if it were HP.
  3. EMC. Forget this rumor. EMC already bought one of ArcSight’s primary competitors (Network Intelligence, now RSA EnVision). There are a dozen security acquisitions I could think of that would make more sense for EMC/RSA.
  4. IBM. A great fit in terms of enterprise software, but this would be IBM’s third security management offering (after the original Tivoli security manager and GuardedNet, which IBM got as a result of the Micromuse deal). Neither of those products has really resonated in the market. If anyone can erase two previous products, IBM can. I rate this one as likely as Oracle.
  5. CA. CA’s security presence is really limited to the identity space. Like IBM, CA has tried several times to penetrate the security management market with little success. I can see CA wanting ArcSight but if Oracle or IBM jump in, the price may quickly get too high for CA.

Given the Intel deal, McAfee is likely out of the running. I’ve heard through the grapevine that McAfee made several attempts at ArcSight but the price tag was just too big. Symantec, like IBM and CA, has also developed security management products that haven’t taken off in the market. If Enrique Salem is up for another big acquisition, ArcSight would be a great fit.

Finally, wherever ArcSight ends up, there are plenty of other innovative security management companies that may quickly follow. Feisty Q1 Labs would be a natural for Juniper. Brainy Nitro Security could be a fit for Cisco or CA. LogRhythm could be a good addition for HP, Check Point, Websense, etc.

ArcSight deserves whatever price it commands, as it has really guided the security management market. Its fate will greatly influence the enterprise security market moving forward.

Lieberman Cybersecurity Bill: Fatal Flaws and What the IT Industry Must Do

Monday, June 21st, 2010

While it may seem like cybersecurity issues have taken a back seat in Washington, there is actually a lot of work happening on Capitol Hill. Senate majority leader Harry Reid (D, NV), is pushing all Senate committees with any type of cybersecurity or industry oversight to get on their legislative horses and address the existing mess.

To that end, Senator Joseph Lieberman (I, CT) is working with colleagues Susan Collins (R, ME) and Thomas Carper (D, DE) on a fairly comprehensive cybersecurity bill called the Protecting Cyberspace as a National Asset Act. The bill seeks to revamp the paper-centric Federal Information Security Management Act (FISMA) of 2002, centralize cybersecurity management in DHS, and establish a more proactive public/private partnership for cybersecurity risk management.

The essence of the bill is certainly welcome. We need to address cybersecurity issues ASAP like President Obama promised he would do more than a year ago. Unfortunately, the Lieberman bill has a few significant flaws, in my opinion. One major problem is with the bill’s link to federal procurement. The Lieberman bill seeks to legislate security in federal IT spending by “creating a system that requires acquisition officers in the federal government to have the knowledge that they need about the vulnerabilities in products.” This in itself is a good idea but:

  1. How do you do this? There is some talk in Washington about insisting that vendors pass some type of security certification that governs their development processes and cyber supply chain assurance model. Okay, but this certification doesn’t exist today and certification can be nothing more than a check box exercise like FISMA is. In the current state of the industry, this requirement is ludicrous.
  2. Product vulnerabilities are one ingredient. The Lieberman bill’s focus on product vulnerabilities hearkens back to cybersecurity issues circa 2004 when it was fashionable to blame Microsoft for all security problems. Yes, these remain important but we need to think about system vulnerabilities (i.e., a superset of product vulnerabilities), comprehensive testing, and a lot more security training.

I don’t claim to be an expert on the Lieberman bill but it seems to me that we are falling into the old Washington scapegoat mentality of looking for a villain (i.e., the IT industry). Don’t get me wrong, lots of vendors should be called to task for unacceptable security practices but these provisions seem overly simple or impossible to enforce to me.

While the Feds figure out the next act in the cybersecurity play, it is really up to the IT industry to step up and establish its own security best practices and self-certification methodology. Strong examples already exist from vendors like HP, IBM, and Oracle. While some folks will certainly flame me for saying so, Microsoft’s SDL is also a model for the rest of the industry.

Legislators are caught between a rock and a hard place. They have to do something but these are uncharted and highly technical waters. This being the case, the IT industry has to do a better job of stepping in and demonstrating leadership. If this doesn’t happen, the U.S. IT industry will face difficult, costly, and confusing legislation that could impact financial results for years to come.

Heterogeneous Server Virtualization

Wednesday, June 2nd, 2010

Hats off to VMware for its leadership in server virtualization. That said, I am hearing more and more stories about heterogeneous server virtualization in large organizations.

Does this mean that VMware is faltering? No. As virtualization has gone from cutting edge to mainstream, however, IT organizations are gaining virtualization experience, testing other platforms, and finding good fits for KVM, Microsoft, Oracle, and XenServer alongside VMware.

At the beginning of 2010, ESG Research asked 345 mid-market (i.e., less than 1,000 employees) and enterprise (i.e., more than 1,000 employees) firms which server virtualization solutions were currently deployed at their organizations. The data supports the scuttlebutt I heard on my recent travels:

  • VMware Server: 38%
  • Microsoft Virtual Server: 36%
  • VMware ESX: 32%
  • Citrix XenServer: 30%
  • Oracle VM: 19%
  • VMware ESXi: 18%
  • Microsoft Hyper-V: 17%
  • Sun xVM Server: 10%
  • KVM: 9%

Based on anecdotal evidence, I don’t think this is a phase; it looks like multiple server virtualization platforms in the enterprise are the future. What does this mean?

  1. Server virtualization will get more complex. IT will need specialization on multiple platforms.
  2. Vendors need to pick multiple dance partners. VMware is clearly a safe bet but IT infrastructure and tools vendors need to think beyond VMware alone. Microsoft and Citrix will likely recruit partners with an endpoint focus while KVM and Oracle will be more of a data center play.
  3. Services opportunities abound. IT complexity and skills deficits are on the horizon. Services vendors that can bridge these gaps will prosper in 2011 and 2012.
© 2010 Enterprise Strategy Group, Milford, MA 01757