The fourth phase (2012-2014) addressed new topics crucial to the CERN scientific programme, such as cloud computing, business analytics, the next generation of hardware, and security for the myriad network devices.
PARIS: CERN, the European physics laboratory and home of the Large Hadron Collider (LHC), is using OpenStack as part of its key IT infrastructure, demonstrating how open source software can evolve to meet even the most extreme customer requirements.
PARIS – Building the Large Hadron Collider itself was doubtless a massive feat, but the machine – a nearly 17-mile ring more than 300 feet underground on the Franco-Swiss border – is useless without the huge data storage and computing capacity needed to analyze the ungodly amount of data it generates when subatomic particles are recorded smashing into each other at extreme speeds.
When the head of infrastructure services at CERN tells you that he has come to the conclusion that there’s nothing intrinsically “special” about the systems at the multi-billion atom-smasher, you naturally want to check you’ve heard correctly.
Data volumes are growing at unprecedented rates, making it more complex for organizations to effectively use that information to make strategic business decisions and gain a competitive advantage. To drive greater value from their data and take advantage of emerging technology trends such as cloud computing and big data, companies must have the ability to move data across highly diverse IT environments and firewalls with very low latency.