This week, the 10th annual Red Hat Summit gets underway in Boston. The open source technology event, which premiered in 2005 in New Orleans, runs June 23-26 at the Hynes Convention Center. There will be more than 170 sessions and labs, 8 keynotes, parties, and receptions showcasing the latest and greatest in cloud computing, platforms, virtualization, middleware, storage, and systems management. There will, of course, be presentations tailored specifically to the verticals: financial services, telecommunications, and healthcare and life sciences.
When you sit down and talk cloud with IT and business executives at financial services companies, the conversation ranges from public clouds and security concerns to more flexible, on-demand computing resources, new revenue opportunities, and beyond. For the most part, financial services institutions are still relatively new to cloud computing, and many are still developing their cloud strategies—but those leading the way are quite knowledgeable about the risks and rewards.
There’s no question DevOps is more than technology. DevOps is a trifecta of people, processes, and technology. Its goal is to help IT organizations across industries, including telecommunications, deliver services faster, better meet the needs of their internal and external customers, and foster innovation. But what are the technology pillars of a successful DevOps initiative?
There are very few industries that are as data-centric as the banking and financial services industries. Every interaction that a client or partner system has with a banking institution produces actionable data that has potential business value, as well as a level of risk associated with it. To stay competitive, financial services institutions have to capture, store and analyze all this data to more accurately forecast market movement, understand operations, screen for fraud and comply with regulations.
For these reasons, the strategic alliance of Red Hat and Hortonworks is focused on helping financial services companies adopt enterprise Apache Hadoop and create competitive advantages by improving risk management, reducing fraud and improving investment decisions.
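To make the fraud-screening use case above concrete, here is a minimal sketch in the style of a Hadoop Streaming mapper/reducer pair. The transaction record format, account IDs, and the review threshold are illustrative assumptions, not a real fraud model or part of the Red Hat/Hortonworks offering.

```python
# Illustrative sketch: total transaction amounts per account and flag
# unusually large totals for review. Record format "account_id,amount"
# and the 10,000 threshold are hypothetical.

FLAG_THRESHOLD = 10_000.0  # assumed review threshold

def map_transactions(lines):
    """Map step: emit (account_id, amount) pairs from CSV records."""
    for line in lines:
        account_id, amount = line.strip().split(",")
        yield account_id, float(amount)

def reduce_totals(pairs):
    """Reduce step: sum amounts per account; mark totals over threshold."""
    totals = {}
    for account_id, amount in pairs:
        totals[account_id] = totals.get(account_id, 0.0) + amount
    return {acct: (total, total > FLAG_THRESHOLD)
            for acct, total in totals.items()}

records = ["A1,9500.00", "A2,120.50", "A1,800.00"]
results = reduce_totals(map_transactions(records))
```

At Hadoop scale, the same map and reduce logic would run in parallel across the cluster, with the framework handling partitioning and shuffling between the two steps.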
IT in the life sciences industry is at a tipping point. Today, there are so many powerful technologies—DevOps, OpenStack, containers, software-defined storage, big data, Hadoop, the list goes on and on—that are leading enterprises to a smarter way of developing enterprise applications and to a more modern, efficient, scalable, cloud-based architecture. This is great news for the massively data-driven life sciences industry.
That said, figuring out the best architectural foundation to support this data, leverage it, and (of course) monetize it is complex. Much of what exists in the data centers of life sciences organizations is antiquated: proprietary systems, lots of manual processes, monolithic applications, and tightly coupled integration.
With Windows Server 2003 end of life just around the corner, telecommunications companies still running that server platform should move quickly on their migration plans. That’s because running a server platform past its end-of-life date carries significant financial, security, and compliance risks.
“Failure to have a current, supported operating system raises significant concerns about an organization’s ability to meet regulatory compliance requirements, as well as the needs of business units, partners, and customers,” noted IT research firm IDC in its February 2015 report, “Windows Server 2003 end of life: An opportunity to evaluate IT strategy.”
This week, IT and business executives from investment and trading firms are gathering in Chicago for The Trading Show Chicago 2015. There are plenty of presentations, roundtables, panels, and networking during the two-day conference, and attendees are hearing all about how to innovate using big data, exchange technology, cloud computing, and more.
More and more companies are choosing OpenStack as their cloud computing infrastructure, and for many that’s because they want a cloud platform that is flexible, avoids vendor lock-in, cuts costs, and serves as a strategic asset to business operations. But what are the key IT elements a company needs to consider when architecting and implementing an OpenStack cloud?
There’s plenty to consider. Organizations need to think through how they’ll use the cloud, which business processes will leverage it, and what their long-term goals for it are. But a good starting point is the foundation. What will the network topology be? Should networking be handled by OpenStack’s legacy nova-network service or by Neutron? Something else? And what storage should be used?
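As a sketch of what the Neutron path looks like in practice, here is an example ML2 plugin configuration fragment. The driver and VNI choices are illustrative assumptions for a typical Open vSwitch/VXLAN deployment, not recommendations for any particular environment.

```ini
# Illustrative ml2_conf.ini fragment for a Neutron ML2 deployment
# (example values only)
[ml2]
type_drivers = flat,vlan,vxlan
tenant_network_types = vxlan
mechanism_drivers = openvswitch

[ml2_type_vxlan]
# Assumed VXLAN network identifier range for tenant networks
vni_ranges = 1:1000
```

Choices like these, made at the foundation stage, constrain the tenant network topologies and scale the cloud can support later.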
The life sciences industry is an industry in transformation. Big data, mobility, advanced analytics, and more—with cloud computing amidst it all—are providing new opportunities for companies to innovate, accelerate research and development, and even cut costs. They’ll have to proceed, of course, with caution, considering the highly regulated environment in which life sciences companies operate. So what’s the best course, and what should the life sciences data center of the future look like?
Most financial services firms developing their cloud computing strategies and implementations are currently focused on infrastructure, and for now that’s to be expected. But what comes next? Once you’ve built your foundation (and a wise choice would be to build it on OpenStack), it’s going to be time to start thinking about capacity-, storage-, and network-on-demand services; application development; and applications capable of increasing agility and business responsiveness. These next steps get at the reason your organization decided to build a cloud in the first place. They’re also where technologies like Business Process Management (BPM) come into play.