Talkin’ to the project lead of OpenStack Object Storage

July 18, 2010

The first code that is available from the OpenStack project, and it’s available today, is the code for the storage effort, “Object Storage.”  The man at the technical helm of this effort is Will Reese of Rackspace.  Will’s daytime job is development manager and system architect for Rackspace’s Cloud Files, the source of the code for Object Storage.  Will and I grabbed some time at last week’s design summit and he briefed me on the project:

Some of the topics Will tackles:

  • Object Storage is based on the open-sourced code from Rackspace’s Cloud Files.
  • What attracted NASA to Cloud Files (think scale).
  • Rackspace will lead the project to get the community kick started but is looking for the community to take over.
  • Storage and Compute will each have their own tech boards made up of members from Rackspace, NASA and the community.
  • In the second half of the interview Will takes us through a quick overview of the Cloud Files architecture, which is written in Python, leverages eventlet, and borrows concepts from memcache and some key-value stores. To learn more, check out Will’s talk at OSCON this Wednesday.
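To make that last point a bit more concrete, the core trick an object store borrows from key-value systems is consistent hashing: objects are mapped onto a “ring” of storage nodes so that adding or removing a node only moves a small slice of the data. Here is a minimal Python sketch of the idea; the node names and virtual-node count are invented for illustration and are not the actual Cloud Files design:

```python
import hashlib
from bisect import bisect

# Hypothetical storage nodes; a real deployment would have many more.
NODES = ["storage1", "storage2", "storage3", "storage4"]
VNODES_PER_NODE = 64  # virtual nodes smooth out the distribution

# Build the ring: a sorted list of (hash, node) points on the circle.
ring = sorted(
    (int(hashlib.md5(f"{node}-{i}".encode()).hexdigest(), 16), node)
    for node in NODES
    for i in range(VNODES_PER_NODE)
)
hashes = [h for h, _ in ring]

def node_for(obj_name: str) -> str:
    """Map an object to the first ring point at or after its hash."""
    h = int(hashlib.md5(obj_name.encode()).hexdigest(), 16)
    return ring[bisect(hashes, h) % len(ring)][1]

print(node_for("photos/cat.jpg"))  # always maps to the same node
```

Because only the keys between a departed node’s ring points and their neighbors get remapped, the cluster can grow node by node without reshuffling everything.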

Extra-credit reading

Pau for now…


Introducing OpenStack — an open source cloud platform

July 18, 2010

Today Rackspace and NASA announced OpenStack, an open source cloud platform that they are collaborating on and building a community around.  Last week the inaugural OpenStack design summit was held here in Austin with 20 companies from around the world, including Dell, participating.

During one of the breaks I grabbed some time with Rackspace’s cloud president, Lew Moorman, to learn more about the effort and get his thoughts:

Some of the topics Lew tackles:

  • What is OpenStack (an open source set of technologies for building clouds…)
  • Why Rackspace decided to open source their code.
  • How Rackspace got hooked up with NASA and what each brings to the party.
  • Taking Nebula’s core foundation and adding some elements from Rackspace’s side in order to put together a release candidate that should be available to the community this Fall.

Extra-credit reading:

Pau for now…


NASA’s chief cloud architect talks OpenStack

July 16, 2010

At the inaugural design summit for OpenStack, an open source set of technologies for building clouds, Nebula’s chief architect Josh McKenty played a prominent role in leading the assembled folks.  I caught Josh during a break and chatted with him about Nebula and NASA’s role in the newly announced OpenStack project.  Here’s what he had to say:

Some of the topics Josh tackles:

  • What is Nebula (hint: NASA’s cloud computing platform, primarily IaaS)
  • The history of Nebula and how it morphed from nasa.net.
  • Why NASA wants a cloud – and the importance of having an elastic set of resources.
  • NASA and Nebula’s use of open source and how it has evolved (they don’t simply fling tarballs over the wall anymore and they can use licenses other than the “NASA open source agreement”)
  • A match made in heaven:  NASA had put together a strong compute platform and was looking to build a real object store, while Rackspace had a strong object store and was looking for a new compute platform.

Extra-credit reading:

Pau for now…


Cloud Billing and the Subscription Economy

July 15, 2010

A couple of weeks ago I was out in the Bay Area to attend GigaOM’s Structure event.  One of the interesting individuals I met there was Tien Tzuo, CEO and founder of Zuora.  Zuora, which counts salesforce.com’s Marc Benioff as an investor, “bills” itself as the leader in subscription billing and payment solutions.  Here is what Tien had to say:

Some of the topics Tien tackles:

  • Zuora as the “Salesforce.com for online billing” (they are a hosted SaaS offering with a point-and-click interface).
  • Their focus on companies that have a recurring subscription based business model e.g. SaaS, cloud, online media…
  • Their launch of Zcommerce for the cloud, which provides billing, recurring payment capability and subscription management capability for cloud environments.
  • The great variety of pricing and packaging models when it comes to cloud payment: an arrears model, a minimum volume commitment model, a pre-pay model, etc.
  • The concept of moving from a product-centric society to “The subscription economy,” you see it in transportation, music, computing …

Extra-credit reading

Pau for now…


Cool Article on the Dell/Azure announcement

July 14, 2010

Monday, as part of Microsoft’s big Azure announcement, we announced that we would be both building an Azure appliance, enabling customers to build their own public or private clouds, as well as developing an Azure public cloud at Dell that our customers can use to develop and deploy next generation services on.

There has been a ton of press surrounding this move by Microsoft to broaden the market for Azure, an effort which also includes similar agreements with HP and Fujitsu. Not surprisingly, my favorite article is one by Charles King that came out yesterday in eCommerce Times — Microsoft’s Windows Azure and Dell: Blue Skies Ahead.

Check out these excerpts and you’ll see why 🙂

Dell is out of the blocks and running with Azure while its rivals are still sorting out their gym bags.

Dell’s cloud efforts tend to be one of the company’s best kept secrets. Some vendors’ continual cloud pronouncements tend to blend into a vuvuzela-like drone, but Dell has simply gotten down to the hard work of building workable commercial cloud and hyper-scale data center solutions during the past three years.

In fact, Dell was the first major vendor to launch a business unit specifically focused on the commercial cloud. By doing so, the company’s Data Center Solutions (DCS) organization has gained invaluable hands-on expertise about the specialized needs of organizations leveraging cloud technologies for applications including hosting, HPC, Web 2.0, gaming, energy, social networking and SaaS. That point likely influenced Microsoft’s 2008 decision to choose Dell as a primary infrastructure partner in developing the Azure platform.

Cool stuff!

Pau for now…


The Dell / Azure Cloud & Appliance

July 12, 2010

Several months ago in the press release that announced our Cloud Solution offerings, there was a particularly cloudy paragraph that talked about Dell’s relationship with Microsoft.  The paragraph ended with the sentence: “Dell and Microsoft will collaborate on the Windows Azure platform, with Dell and Microsoft offering services, and Microsoft continuing to invest in Dell hardware for Windows Azure infrastructure.”  What the heck did that mean?  Well today we can be a bit clearer.

Dell Cloud based on Windows Azure

Earlier this morning at Microsoft’s Worldwide Partner Conference, the giant of Redmond announced the limited production release of the Windows Azure technology for a select few tech giants.  Dell is one of these and will be taking this technology and using it to create our own Platform as a Service (PaaS) cloud.  We will in turn use this cloud to deliver both public and private cloud services to customers looking to develop and deliver next-generation cloud services based on .Net.  This platform will be targeted at enterprise, public sector and small and medium-sized business customers, as well as being used by Dell itself.

But wait, there’s more: Azure in a box

Dell and Microsoft are also working on a Dell-powered Windows Azure platform appliance.  (Don’t let the term “appliance” throw you; this is no box you plug in under a desk, it actually represents hundreds or thousands of servers plus storage and networking.)  Dell will be making this turnkey cloud platform available to enterprises to enable them to set up their own PaaS clouds within their organizations.  Dell has a bit of a leg up here, since we’ve been working with Microsoft on Azure as the primary infrastructure partner since its launch back in ’08.  We’re simply packaging this “winning combination” and providing it in a turnkey package for internal use by enterprises.

A little context: adding to our cloud portfolio

So how does this fit in with some of the other cloud solutions that we have announced?  At a high-level, Dell is providing cloud solutions to help customers take either an evolutionary approach that makes their existing applications more efficient or a revolutionary approach with new applications written for cloud scale (we actually believe customers will do both).

We have already been working with Microsoft to offer evolutionary cloud services based on Microsoft’s Hyper-V platform.  We are now complementing this with a revolutionary Windows Azure appliance.  This turnkey PaaS cloud platform will be in addition to the turnkey PaaS cloud platform that we announced with Joyent.  Whereas the Joyent-based offering, “the Dell cloud solution for web applications,” is targeted at folks developing in Java, PHP, Perl, Python, Ruby on Rails, etc., the Azure appliance will naturally be targeted at the .Net world.  BTW, we also offer solutions based on VMware Redwood/Spring, EMC Atmos and BMC, among others.

Stay tuned for more!

Extra-credit reading

Pau for now…


The Cloud is a marathon — Marten Mickos, Eucalyptus CEO

June 24, 2010

Yesterday at the GigaOM Structure conference here in San Francisco, I ran into Marten Mickos, the recently appointed CEO of Eucalyptus Systems.  Eucalyptus is one of the key ingredients in the Ubuntu Enterprise Cloud that is being certified to run on Dell’s PowerEdge C systems as part of our cloud ISV program.

Marten, the former CEO of MySQL, took the helm of Eucalyptus about three months ago and was at Structure both as an attendee and participant, sitting on two panels at this two-day cloud-a-palooza.  At the end of day one I got some time with Marten and asked him about his new gig.

Some of the topics Marten tackles:

  • How he made the decision to go to Eucalyptus. (Hint: he asked the question, what’s bigger than Open Source)
  • What is Eucalyptus and what’s it based on?
  • How will Marten’s experience at MySQL and Sun help him in his new role at Eucalyptus?
    • MySQL was a disrupter of the old whereas Eucalyptus is an innovator of the new.
    • Sun’s company culture was phenomenal, the technology was phenomenal, the business…um…
  • What Eucalyptus is doing with Canonical and the Ubuntu Enterprise Cloud.
  • What Eucalyptus is focusing on for the next year.

Extra-credit reading:

Pau for now…


Pics from Structure opening festivities

June 24, 2010

Last night, as a lead in to today’s Structure conference, there were two events scheduled here in San Francisco.  The first was a cocktail party hosted by AMD.  The second was a private dinner hosted by GigaOM where each of the sponsors got to send a representative.

Here are some pictures from the two events.

The AMD cocktail party was held in a very cool space. Here it is not long after it opened.

We showed off some of our AMD-based custom systems.

Before long the party was packed (and hot).

I decided to walk from the party to the dinner. The restaurant, the Waterbar, was right under the Bay Bridge.

A bow and arrow along the way.

Pre-dinner drinks.

Pre-dinner remarks with the Bay Bridge as a backdrop. L->R: Mathew Ingram, CEO Paul Walborsky and Stacey Higginbotham

Stay tuned for more stuff from Structure.

Pau for now…


Chocolate covered servers?

June 22, 2010

Is that a heat sink under the Laffy Taffy?

There was a great article about Dell’s Data Center Solutions group that came out a couple of weeks ago.  The article, entitled “Willy Wonka and the Dell Factory,” starts out

If Dell’s cloud server lab is a candy shop for geeks, littered with components and exotic system designs, then Jimmy Pike is the Willy Wonka of servers.

The author then takes the reader on a tour of the top secret Dell Cloud lab explaining,

Like Willy Wonka in the book by Roald Dahl, Pike’s job is to combine ingredients in new and sometimes radical ways. Instead of chocolate and blueberries, his ingredients are chips, fans and motherboards. “Sometimes we bend metal and put boards together with duct tape,” he said…

Servers became “boring” for a while, Pike said, but the requirements of cloud computing have made his job interesting again. “I’ve been doing this for 30 years and I’m having more fun than I’ve ever had,” he said.

And if Jimmy’s having fun, that’s a good thing for everyone. 🙂

Read the whole article here

Want more Jimmy? Check out his data center in a suitcase.

Pau for now…


Cloud Camp Austin 2010

June 17, 2010

Last Thursday over 100 cloud enthusiasts gathered for Cloud Camp Austin.  The event was held at Pervasive Software‘s headquarters and kicked off after 5PM with munchies and beer.  It brought in folks from all around Austin as well as visitors from exotic areas like upstate New York (that group had been in town for meetings).

Pre camp munchies and drinks as folks assemble.

Dell was one of the sponsors, along with IBM, Microsoft, Twilio, Tropo, Redmonk and our hosts, Pervasive.  As always, the event was guided along by Mr. Cloud Camp himself, Dave Nielsen.  This being an “unconference,” after a spontaneously assembled “unpanel” was called upon to answer questions from the audience, the crowd worked together to decide on the topics that would be discussed.

Dave Nielsen explains how this "unpanel" is going to work.

What a difference a year makes

I attended last year’s Cloud Camp in Austin, and I don’t know if it’s the fact that the industry has evolved so much since then or that this year there was a greater percentage of knowledgeable attendees (I suspect a little of both), but this year the topics and questions were much more sophisticated and technical.  As a result, the conversations were much meatier and focused more on “how to” rather than “how do you define.”

All in all a very cool event.

The schedule created on the fly by the attendees.

If you liked Cloud Camp and you like Hadoop, you’ll love…

Speaking of camps, Dave Nielsen is taking the camp idea and applying it to the world of Big Data.  The event, which will be held in Santa Clara on June 28, is imaginatively entitled Big Data Camp Santa Clara.  This unconference is targeted at users of Hadoop and related technologies and will be held the night before Hadoop Summit 2010.  So if you’re in the area and Hadoop/Big Data are your thing, check it out.

Pau for now…


Onlive’s gaming cloud powered by custom Dell servers

June 15, 2010

Today at E3, OnLive Inc is kicking off the roll out of its cloud gaming service.  OnLive, whose motto is “Just Play,” leverages broadband and the cloud to deliver on-demand gaming titles directly to users’ PCs, Macs or even TVs.

Square Enix's Batman: Arkham Asylum -- one of the first batch of games available from OnLive

This new service could prove to be a real “game changer.”  As Dell Data Center Solutions director Andy Rhodes, who is helping with the launch at E3, explains, “I see it as the start of a move of processing power from consoles into data centers…from the center of the living room into the data center.”

Building the OnLive Cloud

So what’s behind this gaming cloud, Dell of course 🙂 (well, at least a good part of it).  The Dell Data Center Solutions (DCS) group began working with OnLive a few years back to design and build custom-tailored systems for the OnLive platform.

The problem statement for the solution was to create an infrastructure that supported the streaming of HD-quality video games over the internet, drove down the total cost of ownership and allowed OnLive to scale quickly as the company grows.  The DCS team worked directly with the folks from OnLive to architect an ultra-dense and uber-power-efficient infrastructure solution designed around OnLive’s super secret hardware components and software.  Thousands of these customized systems are now deployed at OnLive data centers around the country.

Plug and Play Racks

By leveraging the DCS supply chain and fulfillment chops, Dell is able to deliver pre-integrated fully racked solutions that can be hooked up and powered on within hours of arriving at an OnLive data center.  Going forward Dell will continue to work with OnLive to create new infrastructure architectures for future generations of the service.

Game on! (and on, and on and on)

Electronic Arts' Mass Effect 2: available via OnLive

Who’s on First?

The initial batch of 23 titles available to OnLive subscribers includes:

  • Assassin’s Creed II (Ubisoft)
  • Batman: Arkham Asylum (Square-Enix)
  • Borderlands (Take Two Interactive Entertainment)
  • Dragon Age: Origins (Electronic Arts)
  • Just Cause 2 (Square-Enix)
  • Mass Effect 2 (Electronic Arts)
  • NBA 2K10 (Take Two Interactive Entertainment)
  • Prince of Persia: The Forgotten Sands (Ubisoft)
  • Tom Clancy’s Splinter Cell Conviction (Ubisoft)

Extra Credit reading

Pau for now…


An overview of the worldwide gaming market

June 11, 2010

Dell’s Data Center Solutions (DCS) group has both custom offerings and, as we announced a couple of months ago, a new line of systems and solutions targeted at a wider audience.

One of the key markets we are looking at for our new line is gaming.  To get up to speed on the market, I took a look at the report that the PC Gaming Alliance put together for its members.  It was a very cool read.  Here are a few things I learned:

Some fun facts to know and tell:

  • Last year the global PC game software market was just over $13B while the global console software market was nearly $20B.
  • The revenue from PC games  is expected to pass the revenue from console software in 2012.
  • Last year China was the leading country for PC game revenue, 99+% of which came from non-retail sources e.g. subscriptions and digital distribution.
  • Worldwide piracy is decreasing as PC games move from package software to a service based business where users pay per usage.
  • On a revenue basis, the majority of leading PC game companies come from China or South Korea.
  • Biggest growth last year came from the free-to-play (F2P) games where delivery of these games on social networks like Zynga’s Farmville on Facebook took off.

Stay tuned…

Dell has publicly been a big player in the PC gaming market through our line of Alienware systems (in fact, we had an announcement yesterday).  Where we have been a lot quieter, however, is in talking about how our Data Center Solutions (DCS) group fits in.  Next week at E3 we will be making an announcement to explain just what we’ve been up to.  So stay tuned next week and see how DCS “plays” in gaming 🙂

Pau for now…


Talking to Joyent’s CTO and co-founder: Jason Hoffman

June 3, 2010

When I was out in the Bay Area for our launch a while back I stopped by Joyent‘s new headquarters (I actually visited them on their very first day in their new digs). I chatted with CTO Jason Hoffman about his background, what Joyent’s all about and what they are doing with Dell.  Take a listen:

Some of the topics Jason tackles

  • What Joyent does (hint: they provide virtual datacenters)
  • Joyent customers: they range from the top Facebook applications to online media companies, movie, music and TV studios, online retailers…
  • Your next computer is the data center — which needs an operating environment, an open API and a good set of developer tools.
  • How Jason got to where he is: via a doctorate in pathology, where he was an end consumer of compute.  He realized that a lot of the efficiencies developed in his field could be applied to a hosting environment.
  • Dell as Joyent’s “private cloud arm:” Joyent software running on Dell’s hardware where Dell can come in and set up the entire environment enabling departments within companies to act as service providers within their organizations.

To put it in perspective…

And since we’re talking about Dell and Joyent working together, I thought I would include this excerpt from a post that Redmonk analyst Stephen O’Grady recently wrote about the private cloud:

At the present time, however, most of that which we call Platform-as-a-Service – the layer currently serving as middleware – is public cloud only. The PaaS fabrics tend to be proprietary and not available for private consumption. Salesforce.com, for example, doesn’t let you replicate Force.com on your servers. Ditto for Google App Engine. Microsoft Azure features may be trickling back into Windows, but you’re not going to be running Azure in your local datacenter. This is why Dell’s distribution of Joyent’s cloud software came as such a surprise to many; you just don’t see these fabrics being made available locally.

Extra-credit reading

  • Survey Shows More Than Half of Dynamic Language Developers Are Looking To Build Cloud-based Applications in Next Year

Pau for now…


US Military Forges ahead into Cloud & Open Source

May 28, 2010

Following on my entry from yesterday,  here is something pretty cool I learned while doing research on what’s happening in public sector cloud computing:  Forge.mil

From their FAQ they explain:

Forge.mil is a DISA-led activity designed to improve the ability of the U.S. Department of Defense to rapidly deliver dependable software, services and systems in support of net-centric operations and warfare.

What really surprised me was the emphasis they place on “early and continuous collaboration” and their embrace of open source software.  In fact, in an October 16 memo, the DoD’s deputy CIO reiterated the fact that open-source software “meets the definition of ‘commercial computer software,’” and can “provide advantages” given the DoD’s need to “update its software-based capabilities faster than ever.” (source: Wyatt Kash’s article)

Here are some high level stats on Forge.mil’s usage since it started last year:

  • 4,000 Registered users
  • 170 hosted projects
  • Produced more than 500 software releases

The service itself is broken into two cloud-based offerings — SoftwareForge and ProjectForge.  Here are the highlights:

Software Forge
  • A free service for open source and community source DoD software
  • Default is open view access
Project Forge
  • For-fee, non-community source
  • Default is private
  • Originally limited to Army & Navy but on Jan 13 it was made available to other military branches and DoD civilian employees and contractors
  • Two flavors:
    • “On Demand”: multitenant, good for 100 users
    • “Private”: single tenant, can be branded, 100+ users

Who knew?!

Pau for now…


Federal Cloud Computing, two steps forward?

May 27, 2010

Over the past week I have presented Dell’s thoughts and capabilities around cloud computing to several different groups from the U.S. military.  In preparation for these talks I did some research into what’s happening in the wild and wonderful world of federal cloud computing.  Here are a couple things that I found particularly interesting:

Psych!

In the past I have used the General Services Administration’s cloud RFQ, issued last July, as an example of how the government is boldly sallying forth into the cloud.  It turns out that in February they withdrew the RFQ, saying basically that too much had changed since the RFQ was issued and that they needed to regroup and get a solid view of the customer and market landscape before writing a new one.

Speaking of snags, Apps.gov which was launched last September as “an online technology supermarket for federal agencies” has not been the success that Federal CIO Vivek Kundra had hoped for.  According to the WSJ, “concerns about compliance with security requirements and terms of service have prompted many agencies to bypass Apps.gov.”

But wait, there’s more

The above being said, the US government has a ton of cloud projects it’s working on.  To get smart on the litany of efforts, check out the State of Public Sector Cloud Computing report that Vivek Kundra issued last week.

Stay tuned for my next entry that will talk about how the Military is “forging” ahead.

Pau for now…


Big Data in the Windy City

May 20, 2010

The Aqua building, catty corner from my hotel

Last Tuesday and Wednesday, I attended the TDWI (The Data Warehousing Institute) world conference in Chicago.  The show was a mix of courses and exhibit space.

I went to learn about the BI/Data warehousing segment and scout in preparation for the next conference in August.

Why BI?

My interest in the space comes from the fact that two of the three first partners in our Cloud Partner program are in the Data Warehousing and analytics space: Aster Data and Greenplum.  Both these partners are leveraging highly scaled-out architectures to crunch data.

While there, besides checking out the 24 companies on the exhibit floor, I attended three half-day classes: “Developing Your BI Tool Strategy”; “Cool BI: The Latest Innovations”; and “Extending BI to Support Online Marketing and Web 2.0.”

For other newbies like myself, here are some notes from the first course.

My Notes: The layers of the BI Lifecycle stack

BI Suites:

  • What they do: query, report, analyze, visualize, alert (the front end of the chain)
  • The Big 4:  IBM (Cognos), SAP (Business Objects), Oracle (Hyperion), Microsoft
    • They all bought small players who excelled in the space
    • Usually offer the suites as part of a complete BI lifecycle stack
    • Two of the remaining independents are Microstrategy and SAS

Data Management

  • Data warehouse/mart databases and storage
  • Usually in a RDBMS but also in a dedicated OLAP database
  • Examples: Aster Data, Greenplum, Netezza, Teradata

Data Integration (aka ETL)

  • They extract, transform and load info from the layer below into the layer above.
  • Examples: Informatica

Operational Apps/Systems

  • Planning, ERP, CRM etc
  • Orders, Invoices, Shipping, Web clicks
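The flow through these layers, from operational system up through ETL into the warehouse and finally a BI query, can be sketched as a toy ETL step in Python.  All field names and data below are invented for illustration:

```python
# Toy ETL: operational order records -> cleaned warehouse rows (data invented).
raw_orders = [  # "extract": rows as they might arrive from an ERP/CRM system
    {"id": 1, "amount": "19.99", "country": "us"},
    {"id": 2, "amount": "5.00",  "country": "DE"},
    {"id": 3, "amount": None,    "country": "us"},  # dirty row, no amount
]

def transform(row):
    """Cleanse and conform a single record; drop rows with no amount."""
    if row["amount"] is None:
        return None
    return {"order_id": row["id"],
            "amount_usd": float(row["amount"]),
            "country": row["country"].upper()}

# "Load": in a real pipeline this would write to the warehouse database.
warehouse = [t for t in map(transform, raw_orders) if t]

# A BI-suite style query against the warehouse layer: revenue by country.
revenue = {}
for row in warehouse:
    revenue[row["country"]] = revenue.get(row["country"], 0.0) + row["amount_usd"]
print(revenue)  # {'US': 19.99, 'DE': 5.0}
```

Real ETL tools like Informatica do the same extract/transform/load dance with connectors, scheduling and error handling layered on top, but the shape of the work is the same.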

Extra-credit reading

Pau for now…


NetworkWorld Review of Ubuntu Enterprise Cloud

May 18, 2010

Tom Henderson and Brendan Allen of ExtremeLabs published a great walk-thru of the Ubuntu Enterprise Cloud (UEC) last week in NetworkWorld.  Canonical, the commercial sponsor behind Ubuntu, is one of the first members of our Cloud Partner Program, and we will soon be offering UEC running on top of our PowerEdge C line, accompanied by reference architectures.

If you’re not familiar with UEC, which leverages the open source Eucalyptus private cloud platform, here is a quick backgrounder:

Basically, Ubuntu Enterprise Cloud can be deployed on internal hardware to run job/batch applications. The idea is to initially allocate storage, then rapidly build multiple virtual machines to process data, collect the data, then tear down the infrastructure for re-use by a subsequent purpose.

Ubuntu Enterprise Cloud provides internal cloud control methods that closely mimic what can be done on Amazon’s public cloud infrastructure. Its tools can be used to process recurring jobs or one-shot distributed applications, like DNA analysis, video rendering, or database table reformatting/reindexing.

Walk this way

The review, which is a concise three and a half pages, steps you through:

  • Getting started
  • Installation*
  • Setup/configuration
  • Image Bundles
  • Usage/Monitoring

*My favorite line from this section is: “Installation was very simple; we inserted the Ubuntu Server CD, selected Ubuntu Enterprise Cloud, and drank energy drinks.”

If you’re interested in learning about UEC this article is a great place to start.

Extra-credit reading

If the above whets your appetite, you may want to dig into the following:

(The last 3 items I grabbed from Dustin’s Blog)

Pau for now…


Datawarehouser Greenplum — Talking to President and Founder, Scott Yara

May 7, 2010

When I was out in the Bay Area for our launch I stopped by data warehouse and analytics player Greenplum.  Greenplum is one of the first three members in our Cloud Partner program (the other two are Canonical and Aster Data.)  I sat down with Greenplum’s President and founder Scott Yara to talk about the company and where they’re going:

Some of the topics Scott tackles:

  • What’s happening in the world of data.
  • How Greenplum began with the open source PostgreSQL database platform and over the last 7-8 years has refactored it and built a massively parallel database kernel engine.
  • How it works:  Greenplum takes the data and physically distributes it across all the Database segments and operates on the data in parallel.  This parallel approach allows Greenplum to process data 10-100x faster than conventional databases.
  • Who is using it: Skype, Fox Interactive, NTT DoCoMo, Deutsche Bank, retailers, large healthcare companies.
  • The enterprise data cloud initiative – Setting a new type of analytics infrastructure that takes advantage of virtualization and the latest in general purpose and multi-core systems and is centered around self-service principles.
  • While a lot of folks are excited about writing apps for the iPhone, the platform that Scott and crew get really excited about writing to is two-socket Nehalem servers with a bunch of disk drives behind them.
  • How someone would go about getting started with Greenplum.
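The distribute-then-operate-in-parallel idea Scott describes can be sketched roughly in Python.  The segment count, schema and data below are invented, and real Greenplum segments are separate database processes on separate machines rather than threads, but the shape of the computation is the same: hash each row to a segment, let every segment scan only its own slice, then combine the partial results:

```python
from concurrent.futures import ThreadPoolExecutor

SEGMENTS = 4  # a real cluster distributes across many segment databases

# "Distribute": hash each row's key to pick its home segment.
rows = [{"user": f"user{i}", "minutes": i % 60} for i in range(1000)]
segments = [[] for _ in range(SEGMENTS)]
for row in rows:
    segments[hash(row["user"]) % SEGMENTS].append(row)

def partial_sum(segment):
    """Each segment scans only its own slice of the data."""
    return sum(r["minutes"] for r in segment)

# "Gather": run the per-segment aggregates in parallel, then combine.
with ThreadPoolExecutor(max_workers=SEGMENTS) as pool:
    total = sum(pool.map(partial_sum, segments))

print(total == sum(r["minutes"] for r in rows))  # True: same answer, in parallel
```

Since every segment touches only 1/Nth of the data, adding segments (machines) cuts scan time roughly proportionally, which is where the 10-100x speedups over a single conventional database come from.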

Extra Credit reading:

Pau for now…


Dell’s “Custom Tailor” to Internet Stars turns three — Expands to serve the next 1000

May 4, 2010

A few weeks ago, Dell’s Data Center Solutions (DCS) group celebrated its third birthday.  This team — which acts as a “custom tailor” to some of the world’s biggest internet stars — services businesses who require a vast amount of computing horsepower to run data-intensive applications.  In addition to major internet players, DCS’s customers include financial services organizations, national government agencies, institutional universities, laboratory environments and energy producers.

At the recent Dell enterprise launch we announced the expansion of our DCS efforts beyond the 30 customers we have been working with in our “classic” business.  Leveraging the knowledge and experience we have gained working with the biggest of the big, we have created a portfolio of products and solutions to address “the next 1000.”

To get an idea of where DCS has come from and where it’s going, check out the video above that DCS marketing director Andy Rhodes created pre-launch to help our sales force.

Some of the topics Andy tackles:

  • What is “hyper scale,” who uses it and how big is the market?
  • Why was DCS established in the first place and the high-touch nature of its customer relationships.
  • How do we take what we’ve learned working with a small group and take it to the next 1000 customers?
  • How does the hyper-scale-inspired PowerEdge C line differ from the traditional PowerEdge line, and what markets are they targeted at?

Extra credit Reading

Pau for now…


Cloud Pioneer, Salesforce.com

April 29, 2010

Last month when I was out in the Bay Area for our launch, I stopped by the offices of salesforce.com.  I visited with some folks that I used to work with in a past life and then grabbed some time with Salesforce’s VP of product marketing, Sean Whiteley.

Here is what Sean had to say:

Some of the topics Sean tackles:

  • The idea behind salesforce.com (SFDC):  In 1999 founders Marc Benioff and Parker Harris looked at Amazon and wondered why businesses couldn’t manage and get insight into their customers with the same ease as they interact with their favorite website.
  • Given that SFDC is built on a model of “multitenancy,” how do they address security concerns when they are brought up?
  • Force.com: what it is and how it came about.  Also the advent of AppExchange, where you can shop for applications that let you extend the cloud applications that you use to run your business.
  • What salesforce.com and Dell are doing together to address small and medium businesses:  providing a business in a box, helping organizations focus on their core business rather than IT.

Pau for now…