Over the past three years, Dell's Data Center Solutions group has been designing custom microservers for a select group of web hosters. The first generation allowed one of France's largest hosters, Online.net, to enter a new market and gain double-digit market share. The second generation brought additional capabilities to the original design along with greater performance.
A few months ago we announced that we were taking our microserver designs beyond our custom clients and making these systems available to a wider audience. Last month the AMD-based PowerEdge C5125 microserver became available and yesterday the Intel-based PowerEdge C5220 microserver made its debut. Both are ultra-dense 3U systems that pack up to twelve individual servers into one enclosure.
To get a great overview of both the 12-sled and 8-sled versions of the new C5220 system, let product manager Deania Davidson take you on a quick tour:
Target use-cases and environments
Hosting applications such as dedicated, virtualized, shared, static content, and cloud hosting
Web 2.0 applications such as front-end web servers
Power-, space-, weight- and performance-constrained data center environments, such as co-los, and large public organizations, such as universities and government agencies
The day after I attended the Hadoop Summit I paid a visit to Candace Denton and the Plug and Play Tech Center in Sunnyvale. Plug and Play, which occupies the space that once housed Philips Semiconductors, describes itself as a community of over 300 technology startup companies in the areas of Web 2.0, mobile, digital media, software systems (SaaS), semiconductor and telecom verticals.
Candace, who heads up Business Development at Plug and Play, gave me and a couple of Dell compadres a tour of their Sunnyvale facilities. After the tour we sat down in the lobby and she gave a brief synopsis of Plug and Play: where they've come from and what they do.
Some of the ground Candace covers
Plug and Play’s five pillars of resources
(0:22) Funding (they have their own funding arm)
(1:08) Facilities (over 300,000 sq ft in the Valley)
(1:23) Corporate relationships
(1:36) Mentorship and advisory service
(1:41) Networking events (hold up to 120 events a year)
(2:28) A few of the success stories that have exited Plug and Play
(3:43) How their founder Saeed Amidi got into this business: from leasing office space to a fledgling Google, to a bottled water company to setting up their current facilities
Back in April, Dell announced that it was making a billion-dollar investment to further drive its evolution from a systems company to a services company. Specifically, we talked about delivering a raft of new solutions, launching 10 cloud data centers around the world and building out a global network of solutions centers.
The solutions centers are customer-facing facilities that will act as "living labs," providing an environment and the support for customers to architect, build and test proofs of concept involving Dell products, services and solutions. The centers will also support solution integration, technical briefings, validation and ISV certification to meet regional requirements. Last month the first of these centers opened in Limerick, Ireland and in Shanghai.
This morning the first US solutions center opened at Dell HQ here in Austin. Here is a short montage of the opening ceremonies.
Contents
The crowd
(0:16) Opening Remarks: Jan Uhrich, VP of Dell Services
(1:12) Steve Schuckenbrock, President, Dell Services — How the center fits in Dell’s strategy and transformation
(3:51) Paul Bell, President, Dell Public and Large Enterprise — A couple of examples of large customer organizations that had early access to the facility and what they accomplished
(6:26) William Collins, Head of the Austin Solution Center — The next two centers to open in the U.S.
(7:30) Ribbon cutting
After the ribbon cutting I took the tour to see some of the solutions on display. I’ll be posting those in the days to come. Stay tuned!
Here is the final entry in my interview series from the Hadoop Summit.
The night before the summit, I was impressed when I heard Ken Krugler speak at the BigDataCamp unconference. It turns out Ken has been a part of the Hadoop scene since before there was a Hadoop: his 2005 start-up Krugle utilized Nutch, the project out of which Hadoop split and evolved. He now runs a Hadoop consulting practice, Bixo Labs, and offers training.
I ran into Ken the next day at the summit and sat down with him to get his thoughts on Hadoop and the ecosystem around it.
Some of the ground Ken covers
How he first began using Hadoop many moons ago
(0:53) How Hadoop has crossed the chasm over the last half decade
(1:53) The classes he teaches, one very technical and the other an intro class
(2:23) What the heck is Hadoop anyway?
(3:30) What trends Ken has seen recently in the Hadoop world (the rise of the fat node)
The next in my series of videos from the Hadoop Summit features Cloudera's Vice President of Product, Charles Zedlewski. If you're not familiar with Cloudera, you can think of them as the Red Hat of the Hadoop world.
I sat down with Charles to learn more about Cloudera, what they do and where they came from.
Some of the ground Charles covers:
Cloudera’s founding, what its original goals and vision were and where its founders came from.
(1:35) What Cloudera does for customers: 1) packages Hadoop and 2) helps them run it in production environments
(3:27) What channels Cloudera leverages and where they play in the ecosystem
(4:11) Charles’ thoughts on the Yahoo spin-out Hortonworks and how it might affect Cloudera.
I'm now back from vacation and am continuing with my series of videos from the Hadoop Summit. The one-day summit, which was very well attended, was held in Santa Clara the last week of June. One of the two platinum sponsors was MapR Technologies. MapR is particularly interesting since they have taken a different approach to productizing Hadoop than the current leader, Cloudera.
I got some time with their CEO and co-founder John Schroeder to learn more about MapR:
Some of the ground John covers
The announcements they made at the event
(0:16) How John got the idea to start MapR: what tech trends he was seeing and what customer problems he was learning about
(1:43) How MapR’s approach to Hadoop differs from Cloudera (and Hortonworks)
(3:49) How the Hadoop community is growing, both with regards to Apache and the commercial entities that are developing, and the importance of this growth.
Yesterday I attended the Hadoop Summit down in Santa Clara. The one-day event featured a morning of general sessions followed by three tracks of breakout sessions in the afternoon. The event also featured displays by several dozen vendors.
The big topic of the day was Hortonworks, a Yahoo! spin-out that had been announced the day before. The company, which will officially come into being next month, will be made up of 25 core Hadoop engineers from Yahoo! Leading this new venture as its CEO is Yahoo! veteran and, until this week, VP of Hadoop engineering, Eric Baldeschwieler.
In the afternoon I was able to get some time with Eric and learn more about his new gig.
Some of the ground Eric covers
What is Hortonworks and what are its goals?
(0:46) Who is the technical team that will be making up the new venture
(1:48) Their president Rob Bearden, his open source experience and the business expertise he brings
(2:24) Their customers
(2:44) Which Hadoop engineers will remain at Yahoo!
(3:37) The symbiotic relationship Hortonworks and Yahoo! will have and how they will help one another
Extra-credit reading
Press release: Yahoo! and Benchmark Capital to Form Hortonworks to Increase Investment in Hadoop Technology and Accelerate Innovation and Adoption
To close out my series of interviews from last week's Structure conference in San Francisco, below is the chat I had with Data Center Knowledge's founder and editor, Rich Miller. Last week's event was the fourth Structure conference that Rich has attended, and I got his thoughts on some of the hot topics.
Some of the ground Rich covers
How the discussion of cloud has evolved over the last four years
(1:21) Rich’s thoughts on OpenFlow and the networking space
(2:25) Reflections on the next-gen server/chip discussion and the companies on the panel: SeaMicro, Tilera, Calxeda and AMD
(4:25) Facebook's Open Compute project and the new openness in data center design
Last week on day two of Structure, the morning sessions ended with an interesting discussion moderated by James Urquhart. The session was entitled "DevOps – Reinventing the Developer's Role in the Cloud Age" and featured Luke Kanies, CEO of Puppet Labs, and Jesse Robbins, co-founder and CEO of Opscode.
After lunch I ran into Jesse and got him to sit down with me and provide some more insight into DevOps as well as explain what Opscode was doing with project Crowbar.
Some of the ground Jesse covers
(0:21) What is DevOps
(1:00) The shift that happens between developers and operations: writing code and getting it into production faster, and how this shifts responsibilities between the two groups
(2:52) Who are the prime targets for DevOps and how has this changed over time.
How DevOps began in web shops that needed to do things differently than legacy-bound enterprises
How enterprises faced with greenfield opportunities are now embracing DevOps
(5:36) The Crowbar installer, which employs Opscode's Chef and allows the rapid provisioning of an OpenStack cloud
Last Thursday at Structure I ran into a couple of former Sun compadres who have started their own company in the cloud space: Cumulogic. Cumulogic is a PaaS for developing Java applications and boasts the father of Java, James Gosling, and former Sun CIO Bill Vass as the leaders of its technical advisory board.
I got some time with Cumulogic’s CEO Rajesh Ramchandani and learned a bit about their new venture.
Some of the ground Rajesh covers:
Targeting enterprise Java PaaS for federated clouds
Announced the company in January and are conducting user betas now
Seeing early adopters in financial services and healthcare
Currently available as a public cloud via Amazon
Will have a release soon that will allow users to set up a private cloud within an enterprise on environments like VMware, Cloud.com or Eucalyptus
If you're looking for someone who is totally connected in the field of data centers, look no further than green data center blogger Dave Ohara. I met Dave, who is a veteran of HP, Apple and Microsoft, last year at Gartner's data center summit and learned a lot from him at our first meeting. I was therefore very glad to run into him yesterday at the Structure conference.
I got some time with him today and asked Dave a few burning data center questions that have been on my mind.
Some of the ground Dave covers
Why companies' first forays into data centers are all of a sudden becoming hot news
The rise of the marketing and positioning of data centers
What are some of the pitfalls of moving from the public cloud or co-los to your own data center (hint: huge learning curve)
Your data center design should reflect your business model
Today was the second day of the two-day Structure conference here in San Francisco. Cloud was the topic du jour, with heavy referencing of big data and of concepts and projects such as OpenFlow, Open Compute and OpenStack. The format consisted mainly of moderated panels seated in comfy chairs, with breakout sessions scheduled a couple of times during the day.
While some of the panels and speakers were quite enlightening, I find the true benefit of a show like Structure comes from the networking and hallway conversations. One such conversation was with Jonathan Bryce of Rackspace about the incubation program they have just launched for OpenStack.
Some of the ground Jonathan covers:
Dealing with the question of how to expand OpenStack and include new projects
The initial three core projects: Compute, Object Storage and Image Service
The first two projects that have been approved for incubation: a dashboard and “keystone”
Cybera, a Canadian not-for-profit, recently selected OpenStack along with Dell systems to build out their Infrastructure-as-a-Service cloud. The organization, which is based in Alberta, "collaborates with public and private sector partners to accelerate research and product development that meets the needs of today's society."
Most recently Cybera used OpenStack to build out a cloud for CANARIE’s (Canada’s Advanced Research and Innovation Network) DAIR project.
To start with, you’ll need hardware. If you have the time and inclination, the best thing to do might be to ask Rackspace Cloud Builders for some help spec’ing out the hardware for OpenStack. This is the route that Cybera went and we got some badly needed advice. Since you might not be able to go that route, I’ll tell you what we know.
At the end of the day we went with Dell, based on the Cloud Builders' advice and our own due diligence. If you aren't aware of it yet, Dell is supporting OpenStack in a big way; they have a number of pages dedicated to it here. There's also a whitepaper that discusses hardware and networking for OpenStack, if you feel like filling out the form.
We ordered four different types of servers (aka nodes): a management node (nova-api, nova-network, nova-scheduler, nova-objectstore), compute nodes (nova-compute, nova-volume), a proxy node (swift-proxy-server) and storage nodes (swift-object-*, swift-container-*, swift-account-*). All nodes were contained in the Dell C6100 chassis. Here are the specs:
| Node | Processor | Sockets | Cores | Threads | RAM (GB) | Disk |
|------------|-----------|---------|-------|---------|----------|------------|
| Management | E5620 | 2 | 8 | 16 | 24 | 8 x 300 GB |
| Compute | X5650 | 2 | 12 | 24 | 96 | 6 x 500 GB |
| Proxy | E5620 | 2 | 8 | 16 | 24 | 4 x 300 GB |
| Storage | E5620 | 2 | 8 | 16 | 24 | 6 x 2 TB |
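As a rough sketch of how the service-to-node mapping translates into software, here is what the installs might look like on Ubuntu of that era (the package names are my assumption based on the Ubuntu archives, not part of Cybera's write-up; adjust for your distribution and OpenStack release):

```shell
# Hypothetical provisioning sketch -- run the section matching each node's role.

# Management node: the Nova control-plane services
sudo apt-get install -y nova-api nova-network nova-scheduler nova-objectstore

# Compute nodes: the hypervisor-facing services
sudo apt-get install -y nova-compute nova-volume

# Proxy node: the Swift proxy server
sudo apt-get install -y swift-proxy

# Storage nodes: the Swift object, container and account servers
sudo apt-get install -y swift-object swift-container swift-account
```

Keeping the Nova control services off the compute nodes, and the Swift proxy separate from the storage servers, mirrors the chassis layout in the spec table above.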
Great to see people picking up OpenStack and running with it!
Last week Dell was out in force at the Cloud Computing Expo in New York as the event's diamond sponsor. Besides the keynote that Dell Services President Steve Schuckenbrock delivered, Dell also gave, or participated in, 11 other talks.
I also gave one of the talks; mine focused on Dell's revolutionary approach to the cloud and how this approach is setting a new bar for IT efficiency.
Here’s the deck:
(If the embedded deck doesn’t appear above, you can go to it directly on slideshare).
Talking with Press and Analysts
At the event I also met with press and analysts. One of the things I find helpful in explaining Dell's strategy and approach to the cloud is to sketch it out for someone in real time. I guess analysts Chris Gaun and Tony Iams of Ideas International found it helpful, since they both tweeted a picture of it :).
Besides analysts, I also met with several individuals from the press. Mark Bilger, CTO of Dell Services, and I met with Michael Vizard of IT Business Edge, and the conversation resulted in the following article: Cloud Computing Starts to Get a Little Foggy.
Additionally, to support the event and Dell's cloud efforts going forward, Dell launched the Dell in the Clouds site. It's pretty cool; you may just want to check it out.
Extra-credit reading (all my posts from Cloud Expo):
Last week at Cloud Expo in New York, Dell commissioned an independent third party, Marketing Solutions Corporation, to conduct a survey of IT professionals who were attending the show. The survey, which excluded IT members of technology providers, posed a series of cloud-related questions and asked the IT professionals to answer both from their own point of view and from the point of view of their non-technical senior management.
The results are in
Not surprisingly, the 223 IT respondents were split: 47% saw cloud as an extension of long-term trends toward remote networks and virtualization, while 37% believed it was a radically new way to think about their own IT function. When asked how they thought senior managers would view the cloud, 37% felt management saw cloud computing as having "immense potential."
Interestingly, while 66% of the respondents said their IT department would both advocate and benefit from cloud-based solutions, most didn't expect similar support or optimism from other departments. The next closest function was customer service, which only 26% of the respondents felt would view cloud with equal optimism, followed by marketing and sales at 25%.
To learn more about the survey and the conclusions drawn, see the release that went out Friday.
Earlier this week at Cloud Expo, I talked to both Peder Ulander of Cloud.com and Rich Wolski of Eucalyptus about their involvement with RightScale's myCloud solution. Yesterday I thought I would go straight to the source, so I got a hold of RightScale's VP of business development, Josh Fraser.
Besides the myCloud announcement, Josh also told me about their work with Zynga. Zynga, as detailed in a recent InformationWeek article, has a hybrid cloud model: it uses the Amazon public cloud to test new games, and if a game is a hit, once its demand has leveled off they pull it back into their Z-cloud private cloud. RightScale manages across the two clouds.
Some of the ground Josh covers
What is RightScale
[0:26] Their myCloud announcement, widening their focus beyond public clouds to include private and hybrid. Who they’re partnering with, what myCloud is composed of and their free version.
[2:38] Working with Zynga, managing across both Zynga’s private Z-cloud and the public cloud they use at Amazon.
[4:09] Working with Amdocs who is running enterprise grade workloads in a private cloud managed by RightScale.
Yesterday at Cloud Expo I bumped into Dr. Rich Wolski, CTO and co-founder of cloud player Eucalyptus. It had been a while since we had last talked, so I grabbed some time with him and got him to give me the skinny:
Some of the ground Rich covers:
Eucalyptus’s major release which is coming out in the next 4 weeks
[0:40] The RightScale myCloud integration that they announced yesterday (linking Eucalyptus private clouds with various public clouds)
[2:01] Eucalyptus’s relationship with Canonical and how their interests are diverging
[3:15] Where specifically Eucalyptus is targeted
[4:25] What are some of their goals and product features they’d like to add over the next year
Today when I was walking the floor at the Cloud Expo here in New York, I ran into fellow Austinite Dustin Kirkland. Dustin is the manager of the systems integration team for Ubuntu. I got Dustin to give me the lowdown on the most recent UDS (Ubuntu Developer Summit), which concluded a few weeks ago in Budapest:
Some of the ground that Dustin covers
The big areas of focus on the server side coming out of Budapest
Getting behind OpenStack as the Ubuntu IaaS platform
[1:09] The pioneering work they’ve done with Eucalyptus and how its use case differs from that of OpenStack
[2:05] The Ensemble project, a service orchestration framework for the cloud which is the brainchild of Mark Shuttleworth.
[3:59] Ubuntu Orchestra for cloud installation, provisioning and configuration management (using Puppet)
Last night at Cloud Expo, I got some time with Cloud.com's CMO Peder Ulander to learn how they are working with two key partners, OpenStack and RightScale. Peder told me how OpenStack is a key relationship for Cloud.com and gave me a quick overview of today's announcement that Cloud.com is powering RightScale's myCloud Private Cloud offering:
Some of the ground Peder covers:
OpenStack: The development work Cloud.com is doing on OpenStack; their work on a Swift implementation; and how Cloud.com and OpenStack might play together going forward
[1:25] RightScale: The myCloud announcement and the advantages it brings to enterprises. How the two companies are doing joint development and joint marketing.
Tonight at the opening reception for Cloud Expo, I ran into Peder Ulander, CMO of Cloud.com. We found a quiet spot off the show floor and I got him to tell me all about Cloud.com: where they've been and where they're going.
Some of the ground Peder covers
What is cloud.com, where does it play in the cloud ecosystem and what does it help customers do?
[01:22] Who are some of Cloud.com's customers (hint: Nokia, Zynga, Korean Telecom…) and what industries are they in?
[03:25] Where did the idea for cloud.com come from and what experience did the founders leverage in creating it?