Posts tagged "Data center"

Ultra-Efficient Data Centers and Servers, But Will We Need Them in the Future?

Sometimes the solution to a problem involves thinking completely outside the box. Case in point: the rapidly increasing quantity of data that exists. Recent studies suggest that by 2017 there will be 3.6 billion people online, 19 billion global machine-to-machine network connections and 1.4 zettabytes of information being generated online. At the same time, Jonathan Koomey, a research fellow at the Steyer-Taylor Center for Energy Policy and Finance at Stanford University, recently told the “How Green is the Internet?” symposium held by Google that the internet uses around 10% of the world’s electricity (up 25% in a little over a decade). Put this ever-increasing deluge of information together with the corresponding increase in energy consumption and you have a real cause for concern for those who operate data centers at scale. Indeed, Gartner research recently identified skyrocketing energy demands and costs as the #1 concern for data center operators globally.

So what is the solution? The traditional approach has been to make data centers as efficient as possible. Data center operators have invested heavily in creating highly efficient facilities; I had the pleasure of visiting one of the best examples of ultra-efficiency last year at the SuperNAP facility in Las Vegas. The other approach has been at the server level: in recent years there has been a plethora of low-power server developments. Just the other week Servergy released a new class of servers boasting the highest performance per watt available. According to the company, their new server line saves up to 16X in server energy and space costs over traditional systems.

That’s all well and good, but what happens when you think outside the square? When you have the ability not to look at commercial imperatives but rather to look at a problem space conceptually? When you’re not tied to a corporate entity but are immersed in the pure research atmosphere of a university? Researchers at Cambridge University had that opportunity and came up with something fascinating: a future internet infrastructure that doesn’t need servers to function.

As covered over on GigaOm, researchers have found a way to deliver content without the need for servers to house and deliver the data. This move away from centralized computing is part of a project funded by the European Union called Pursuit. The Pursuit internet is similar in approach to BitTorrent or Skype: information is shared directly between individual users rather than through a third-party server. The concept replicates data in multiple locations to increase efficiency and make the network as a whole more robust.
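That replicate-and-share model is easy to sketch in a few lines. To be clear, this is a toy illustration and not Pursuit's actual protocol; the class and method names are my own inventions. Content is named by a label (here, the SHA-256 hash of its bytes) and copied to several peers, so any peer holding the label can answer a request, with no central server in the path.

```python
import hashlib

class PursuitLikeNetwork:
    """Toy model of information-centric delivery: content is named by a
    label (here, the SHA-256 of its bytes) and replicated across peers,
    so any holder can answer a request -- no central server in the path."""

    def __init__(self, peer_ids):
        # each peer keeps its own small store of label -> data
        self.peers = {p: {} for p in peer_ids}

    def publish(self, peer_id, data, replicas=3):
        label = hashlib.sha256(data).hexdigest()
        # replicate to several peers so the network stays robust
        # even if the original publisher disappears
        targets = [peer_id] + [p for p in self.peers if p != peer_id][:replicas - 1]
        for p in targets:
            self.peers[p][label] = data
        return label

    def fetch(self, label):
        # any peer holding the label can serve the content
        for store in self.peers.values():
            if label in store:
                return store[label]
        return None
```

The robustness claim falls out directly: once a label is replicated, the original publisher can drop offline and the content is still reachable from any surviving holder.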

The research team has already created a proof of concept of Pursuit. Given current concerns about NSA access to centralized data, the ideas raised by Pursuit obviously resonate with those who keep a watchful eye on privacy. Of course, Pursuit doesn’t answer the concerns of those who suspect there is widespread capturing of data on the public internet. The technical manager for Pursuit, Dirk Trossen, responded to this concern, saying:

Similar to today, if you designed the deployment appropriately, censorship and surveillance would become very difficult (using encryption, ‘hiding’ behind labels without using meaningful names, or changing the name-to-label association rapidly). However, censorship and surveillance can also become easy by centralising the main components. All this, however, is similar to today’s internet. The surveillance unearthed by Snowden was enabled at large by the centralisation of main components of today’s internet (in U.S. jurisdiction). There are certain architectural measures one can take to circumvent that, but it’s hard nonetheless. I don’t think that it would be much different in a Pursuit world, if you don’t have the societal push for reduced surveillance. In short: censorship and surveillance is a policy/society problem.

Pursuit aims to replace the protocols used today on the public internet. This is in contrast to a project out of PARC, the “content centric network,” which would run alongside TCP/IP. For those who raise questions about the sheer volume of data needing to be stored and served, Trossen points out that the massive increase in the number of connected devices, combined with Pursuit’s ability to handle files split into multiple “chunks,” means that this isn’t a major barrier to adoption.
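The “chunks” idea is simple to illustrate. A hypothetical sketch (not Pursuit's actual wire format; function names and the tiny chunk size are my assumptions): content is split into fixed-size pieces, each named by its hash, so a receiver can fetch different pieces from different peers and verify every piece independently before reassembly.

```python
import hashlib

CHUNK_SIZE = 4  # bytes, kept tiny for illustration; real systems use far larger chunks

def split_into_chunks(data, size=CHUNK_SIZE):
    """Split content into labelled chunks so different peers can serve
    different pieces of the same file in parallel."""
    pieces = [data[i:i + size] for i in range(0, len(data), size)]
    return [(hashlib.sha256(p).hexdigest(), p) for p in pieces]

def reassemble(labelled_chunks):
    """Verify each chunk against its label, then join them back up."""
    for label, chunk in labelled_chunks:
        if hashlib.sha256(chunk).hexdigest() != label:
            raise ValueError("chunk does not match its label")
    return b"".join(chunk for _, chunk in labelled_chunks)
```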

Of course there are plenty of other barriers to Pursuit gaining a toehold, in particular the fact that some of the biggest sources of stored data, such as Google, Facebook and even Twitter, make money specifically out of storing, analyzing and acting upon user data. Any move to distributed storage outside of their control would likely be met with resistance.

Still, as a pure research project, Pursuit is very interesting. I don’t think the data center operators (or the efficient server vendors, for that matter) have anything to worry about just yet, though.

Enhanced by Zemanta


Posted by plates55 - December 16, 2013 at 9:32 am

Categories: Trends

Windows Server 2012 Certification

The Microsoft Certified Solutions Expert (MCSE): Server Infrastructure certification validates your ability to build comprehensive server infrastructure solutions. Show that you have the skills needed to run a highly efficient and modern data center, with expertise in identity management, systems management, virtualization, storage, and networking.

MCSA: Windows Server 2012
+ Exam 413 Beta: Designing and Implementing a Server Infrastructure
+ Exam 414 Beta: Implementing an Advanced Server Infrastructure
= MCSE: Server Infrastructure



Need to upgrade your certification?

Upgradeable certifications

Exam 417: Upgrading Your Skills to MCSA Windows Server 2012
+ Exam 413 Beta: Designing and Implementing a Server Infrastructure
+ Exam 414 Beta: Implementing an Advanced Server Infrastructure
= MCSE: Server Infrastructure





Posted by plates55 - September 8, 2012 at 1:31 pm

Categories: Uncategorized

System Center 2012 Gets Generous Licensing Changes

The soon-to-be-released System Center 2012 product suite will get a huge licensing overhaul.

Where the current System Center 2010 suite contains 110(!) licensing items, SC 2012 will have only 2. Yep, that’s right: only 2 choices of SKU, and that covers the whole suite of SC products!

That’s not all: these licenses also include SQL licenses if an SC 2012 component requires them.

That is still not all. Besides Opalis and AVIcode now being fully integrated in 2012, the suite gains App Controller and Forefront Endpoint Protection (anti-malware)!


So that is 8 System Center products included in 2 license forms:


An OSE is an Operating System Environment.

Each license is for a dual-processor machine, that is, 2 physical processors, not cores. So maybe use the term “socket” to avoid the proc/core confusion.

Standard SKU: 1,300 USD per 2 processors (sockets)

Datacenter SKU: 3,600 USD per 2 processors (sockets), includes SQL runtime
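Since each license covers 2 sockets, the cost math for a given host is simple. A quick sketch using the list prices above; the function name and the round-up rule for odd socket counts are my assumptions, not Microsoft's official licensing calculator:

```python
import math

# List prices from the post, per 2-processor (2-socket) license
STANDARD_PER_LICENSE = 1300    # USD
DATACENTER_PER_LICENSE = 3600  # USD, includes SQL runtime

def sc2012_license_cost(sockets, sku="datacenter"):
    """Each license covers 2 physical processors (sockets), so a
    4-socket host needs 2 licenses; odd counts round up."""
    licenses = math.ceil(sockets / 2)
    per_license = DATACENTER_PER_LICENSE if sku == "datacenter" else STANDARD_PER_LICENSE
    return licenses * per_license
```

For example, a typical 2-socket virtualization host would be 3,600 USD on the Datacenter SKU, while a 4-socket host doubles that.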



And there is still more: the licensing transition plan is also very generous!


Also available in combination with ECI licensing, with a minimum initial purchase of 25.


Now compare the new licensing with VMware:



Some examples of which license is most economical:


More System Center 2012 news later this week.



Posted by plates55 - January 19, 2012 at 8:26 am

Categories: Microsoft

Instant Data Center: The HP POD 240a

Here on the show floor at HP Discover is a “slice” of HP’s container-based data center solutions.

It is not the sea container we saw a few years back. You can order these and get them complete, pre-racked and pre-installed with servers. Building a data center this way saves a lot of time, a lot…

The HP POD 240a delivers the ultimate in data center efficiency: 88 percent faster data center deployments, capital expenditures reduced by up to 75 percent, and energy waste reduced by as much as 95 percent compared to traditional brick-and-mortar data centers.
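Taken at face value, those headline percentages translate into big numbers. A back-of-the-envelope sketch; the baseline figures in the example are made up, and the percentages are simply HP's quoted maximums applied naively:

```python
def pod_savings(capex_usd, deploy_days, energy_waste_kwh):
    """Apply HP's quoted maximums -- up to 75% lower capex, 88% faster
    deployment, up to 95% less energy waste -- to a traditional baseline."""
    return {
        "capex_usd": capex_usd * (1 - 0.75),
        "deploy_days": deploy_days * (1 - 0.88),
        "energy_waste_kwh": energy_waste_kwh * (1 - 0.95),
    }
```

On a hypothetical $10M brick-and-mortar build, that would mean roughly $2.5M of capex and a deployment measured in weeks rather than a year.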

The HP POD can ship with fully integrated and tested IT from an HP factory, slashing the time for data center build-out. It comes to you as part of a complete data center solution, with services ranging from strategy and site planning to innovative products and comprehensive global support.



[Image IMG_0046: scale model of the HP POD]



Posted by plates55 - December 4, 2011 at 10:10 pm

Categories: Uncategorized

