
Red Hat Buys Inktank

May 21, 2014
Filed under Computing

Red Hat has announced that it is buying storage system provider Inktank.

Inktank is the company behind Ceph, the distributed object and block storage software used in a number of OpenStack cloud configurations.
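
For a sense of what Ceph’s object storage looks like to an application, here is a minimal sketch using Ceph’s Python bindings (python-rados); the configuration path and the “data” pool name are assumptions for the example, not details from the article.

    import rados

    # Connect to a Ceph cluster; a reachable cluster and the usual
    # config file location are assumptions for this sketch.
    cluster = rados.Rados(conffile="/etc/ceph/ceph.conf")
    cluster.connect()
    try:
        ioctx = cluster.open_ioctx("data")  # I/O context on a hypothetical pool
        try:
            ioctx.write_full("greeting", b"hello ceph")  # store an object by name
            print(ioctx.read("greeting"))                # read it back: b'hello ceph'
        finally:
            ioctx.close()
    finally:
        cluster.shutdown()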

The deal, worth $175m, will see Ceph continue to be marketed alongside Red Hat’s own GlusterFS, and Red Hat does not believe it will adversely affect its financial forecasts for the year.

In a statement, Brian Stevens, EVP and CTO of Red Hat said, “We’re thrilled to welcome Inktank to the Red Hat family. They have built an incredibly vibrant community that will continue to be nurtured as we work together to make open the de facto choice for software-defined storage. Inktank has done a brilliant job assembling a strong ecosystem around Ceph and we look forward to expanding on this success together.”

As part of the deal, Ceph’s monitoring and diagnostics tool, Calamari, will also become open source, allowing users to add their own modules and functionality.

Inktank founder Sage Weil used his blog to assure users that the two storage systems will be treated with equal respect. “Red Hat intends to administer the Ceph trademark in a manner that protects the ecosystem as a whole and creates a level playing field where everyone is held to the same standards of use.”

Red Hat made the announcement fresh from its Red Hat Summit in San Francisco, where the company reaffirmed that it is the Linux distribution of choice at the CERN supercollider in Switzerland.

The Inktank deal is set to close later this month.

Source

Red Hat Goes Atomic

April 30, 2014
Filed under Computing

The Red Hat Summit kicked off in San Francisco on Tuesday, and continued today with a raft of announcements.

Red Hat launched a new fork of Red Hat Enterprise Linux (RHEL) under the title “Atomic Host”. The new version is stripped down to enable lightweight deployment of software containers. Although the mainline edition also supports software containers, this lightweight version improves portability.

This is part of a wider Red Hat initiative, Project Atomic, which also sees the container platform Docker updated as part of the ongoing partnership between the two organisations.

Red Hat also announced a release candidate (RC) for Red Hat Enterprise Linux 7. The beta version has already been downloaded 10,000 times. The Atomic Host fork is included in the RC.

Topping all that is the news that Red Hat’s latest stable release, RHEL 6.5, has been deployed at the European Organisation for Nuclear Research, better known as CERN.

The European laboratory, which houses the Large Hadron Collider (LHC) and was the birthplace of the World Wide Web, has rolled out the latest versions of Red Hat Enterprise Linux, Red Hat Enterprise Virtualisation and Red Hat Technical Account Management. Although Red Hat has a long history with CERN, this has been a major rollout for the facility.

The logging server of the LHC is one of the areas covered by the rollout, as are the financial and human resources databases.

The infrastructure comprises a series of virtualised dual-socket Dell PowerEdge M610 servers with up to 256GB of RAM each and full redundancy to prevent the loss of mission-critical data.

Niko Neufeld, deputy project leader at the Large Hadron Collider, said, “Our LHCb experiment requires a powerful, very reliable and highly available IT environment for controlling and monitoring our 70 million CHF detectors. Red Hat Enterprise Virtualization is at the core of our virtualized infrastructure and complies with our stringent requirements.”

Other news from the conference includes the launch of OpenShift Marketplace, allowing customers to try solutions for cloud applications, and the release of Red Hat JBoss Fuse 6.1 and Red Hat JBoss A-MQ 6.1, standards-based integration and messaging products designed to manage everything from cloud computing to the Internet of Things.

Source

Oracle Updates NoSQL

April 22, 2014
Filed under Computing

Oracle has announced the availability of the latest edition of its NoSQL database.

NoSQL is Oracle’s distributed key-value database. Now in its third version, the database gains enhancements centred on security and business continuity.

Oracle NoSQL 3.0 features security improvements including cluster-wide password-based user authentication and integration with Oracle Wallet. Session-level Secure Sockets Layer (SSL) encryption and network port restriction are also included.
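
For readers unfamiliar with the term, session-level SSL encryption simply means the client’s connection to the database is wrapped in SSL/TLS so everything sent over that session is encrypted. The sketch below illustrates the general idea with Python’s standard ssl module; it is a generic illustration, not the Oracle NoSQL driver configuration.

    import socket
    import ssl

    # Open a TCP session and wrap it in TLS; from then on, all traffic
    # on the session is encrypted. Host and port are placeholders.
    ctx = ssl.create_default_context()
    with socket.create_connection(("example.com", 443)) as raw_sock:
        with ctx.wrap_socket(raw_sock, server_hostname="example.com") as tls_sock:
            print(tls_sock.version())  # negotiated protocol, e.g. 'TLSv1.3'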

For disaster recovery and prevention, there’s automatic fail-over to metro-area secondary data centres, while secondary server zones can be used to offload read-only workloads to take the pressure off primary servers under stress.

For developers, there is added support for tabular data models, which Oracle claims will simplify application design and improve integration with SQL-based applications, while secondary indexing improves query performance.
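
To see why a secondary index speeds up queries, here is a conceptual Python sketch, not the Oracle NoSQL client API: records live under a primary key, and a second structure maps an attribute value back to the primary keys that carry it, turning a full scan into a single lookup.

    # Conceptual sketch of a secondary index over a key-value store.
    # Plain dicts stand in for the store; names are illustrative only.
    store = {}    # primary key -> record
    by_city = {}  # secondary index: city -> set of primary keys

    def put(user_id, record):
        store[user_id] = record
        by_city.setdefault(record["city"], set()).add(user_id)

    put("u1", {"name": "Ada", "city": "London"})
    put("u2", {"name": "Grace", "city": "New York"})
    put("u3", {"name": "Edsger", "city": "London"})

    # Without the index, finding Londoners means scanning every record;
    # with it, the query is a single dictionary access.
    print([store[k] for k in by_city["London"]])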

“Oracle NoSQL 3.0 helps organisations fill the gap in skills, security and performance by delivering […] enterprise-class NoSQL database that empowers database developers and DBAs to easily, intuitively and securely build and deploy next generation applications,” said Oracle’s EVP of Database Server Technologies, Andrew Mendelsohn.

It has already been a big week for the database community, with NoSQL features arriving on MariaDB for the first time courtesy of a tie-up between SkySQL, Google and IBM on Tuesday, while yesterday Fusion-io announced the use of non-volatile memory (NVM) compression in MySQL to increase the capacity of SSD storage.

Both the community and enterprise versions of Oracle NoSQL Database 3.0 are available for download now from the Oracle Technology Network.

Source

Ubuntu Cross-Platform Delayed

February 26, 2014
Filed under Computing

Ubuntu will not offer cross-platform apps as soon as it had hoped.

Canonical had raised hopes that its plan for Ubuntu to span PCs and mobile devices would be realised with the upcoming Ubuntu 14.04 release, providing a write-once, run-on-many template similar to that planned by Google for its Chrome OS and Android app convergence.

This is already possible on paper, and the infrastructure is in place on smartphone and tablet versions of Ubuntu through its new Unity 8 user interface.

However, Canonical has decided to postpone the rollout of Unity 8 for desktop machines, citing security concerns, and it will now not appear, together with the Mir display server, until this coming autumn.

This will apply only to apps in the Ubuntu store, and in the true spirit of open source, anyone choosing to step outside that ecosystem will be able to test the converged Ubuntu before then.

Ubuntu community manager Jono Bacon told Ars Technica, “We don’t plan on shipping apps in the new converged store on the desktop until Unity 8 and Mir lands.

“The reason is that we use app insulation to (a) run apps securely and (b) not require manual reviews (so we can speed up the time to get apps in the store). With our plan to move to Mir, our app insulation doesn’t currently insulate against X apps sniffing events in other X apps. As such, while Ubuntu SDK apps in click packages will run on today’s Unity 7 desktop, we don’t want to make them readily available to users until we ship Mir and have this final security consideration in place.

“Now, if a core-dev or motu wants to manually review an Ubuntu SDK app and ship it in the normal main/universe archives, the security concern is then taken care of with a manual review, but we are not recommending this workflow due to the strain of manual reviews.”

As well as the aforementioned security issues, there are still concerns that cross-platform apps don’t look quite as good on the desktop as native desktop versions, and the intervening six months will be used to polish the user experience.

Getting the holistic experience right is essential for Ubuntu in order to attract OEMs to the converged operating system. Canonical’s attempt to crowdfund its own Ubuntu handset fell short of its ambitious $32m target, despite the $12.8m raised being the largest total ever pledged to a fixed crowdfunding campaign at the time.

Source

Red Hat Releases Enterprise Linux 7 Beta

December 27, 2013
Filed under Computing

Red Hat has made available a beta of Red Hat Enterprise Linux 7 (RHEL 7) for testers, just weeks after the final release of RHEL 6.5 to customers.

RHEL 7 is aimed at meeting the requirements of future applications as well as delivering scalability and performance to power cloud infrastructure and enterprise data centres.

Available to download now, the RHEL 7 beta introduces a number of enhancements, including better support for Linux Containers, in-place upgrades, XFS as the default file system, improved networking support and improved compatibility with Windows networks.

Inviting customers, partners, and members of the public to download the RHEL 7 beta and provide feedback, Red Hat is promoting the upcoming version as its most ambitious release to date. The code is based on Red Hat’s community-developed Fedora 19 distribution of Linux and the upstream Linux 3.10 kernel, the firm said.

“Red Hat Enterprise Linux 7 is designed to provide the underpinning for future application architectures while delivering the flexibility, scalability, and performance needed to deploy across bare metal, virtual machines, and cloud infrastructure,” Senior Product Marketing Manager Kimberly Craven wrote on the Red Hat Enterprise Linux blog.

These improvements address a number of key areas, including virtualisation, management and interoperability.

Linux Containers, for example, were partially supported in RHEL 6.5, but this release enables applications to be created and deployed using Linux Container technology such as the Docker tool. Containers offer operating-system-level virtualisation, which provides isolation between applications without the overhead of virtualising an entire server.
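
As a rough illustration of how lightweight that is in practice, the sketch below starts a throwaway container via the Docker SDK for Python; a running local Docker daemon, the SDK itself (a present-day tool, newer than the release covered here) and the “fedora” image are all assumptions for the example.

    import docker  # Docker SDK for Python: pip install docker

    # Talk to the local Docker daemon; assumes one is running.
    client = docker.from_env()

    # The container shares the host kernel, so no guest OS boots: the
    # process starts in moments, unlike a full virtual machine.
    output = client.containers.run("fedora", "cat /etc/os-release", remove=True)
    print(output.decode())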

Red Hat said it is now supporting an in-place upgrade feature for common server deployment types. This will allow customers to migrate existing RHEL 6.5 systems to RHEL 7 without downtime.

RHEL 7 also makes the switch to XFS as its default file system, supporting file systems of up to 500TB, while ext4 file systems are now supported up to 50TB in size and B-tree file system (btrfs) implementations are available for users to test.

Interoperability with Windows has also been improved, with Red Hat now including the ability to bridge Windows and Linux infrastructure by integrating RHEL 7 and Samba 4.1 with Microsoft Active Directory domains. Red Hat Enterprise Linux Identity Management can also be deployed in a parallel trust zone alongside Active Directory, the firm said.

On the networking side, RHEL 7 provides support for 40Gbps Ethernet, along with improved channel bonding, TCP performance improvements and low latency socket poll support.

Other enhancements include support for very large scale storage configurations, including enterprise storage arrays, and uniform management tools for networking, storage, file systems, identities and security using the OpenLMI framework.

Source

Does Intel Need Help?

October 7, 2013
Filed under Computing

As time runs out for Intel to bring its Internet-based TV service to market by the end of the year, the outfit has approached Samsung and Amazon to ask them to lend a hand. Intel has asked about providing funding and distribution for the service. It looks like the set-top box project could be scrapped if a strategic partner isn’t found soon.

OnCue was supposed to offer live TV, on-demand programming and other services. Intel said it would provide the hardware and services directly to consumers and that the box would come with a camera that can detect who is in front of the TV. More than 300 engineers are working on the project under Erik Huggers, the head of Intel Media. A version of the service running on Intel hardware is being tested by 3,000 Intel employees. Goodness knows what content they are watching; Intel is having difficulty getting content deals.

Intel has yet to announce any TV programming partners, and Time Warner Cable and other cable TV providers have been pressuring channel owners to shun pacts with Intel and other Internet-based TV providers. Samsung, which ships millions of smart TVs, could distribute the service as a bundle, while Amazon could provide access to its growing library of movies and TV shows.

Source

Microsoft Updates Azure

April 26, 2013
Filed under Computing

Microsoft has rolled out a major update to its Azure cloud computing service and said that it will match Amazon on price.

Last year Microsoft announced it would preview a host of changes to its Azure cloud computing service including new virtual machine configurations, a virtual private network and a new Azure software development kit. Now the firm has taken those features out of preview and made them generally available in what it is promoting as the largest single update to Windows Azure to date.

Since Microsoft announced most of the features in its “hybrid cloud” last June, the firm said, the only changes from the preview release to today’s public release are higher memory capacity and higher-performance compute nodes. However, the firm touted its Windows Azure Virtual Network as a way for customers to treat cloud-based services as if they were located on their premises.

Microsoft couldn’t rely on features alone to take the fight to Amazon and its Web Services division. Amazon’s cloud service is the biggest rival to Microsoft Azure and has a reputation for cutting prices aggressively. Now Microsoft has said it will do the same in a bid “to take the price discussion off the table”.

Michael Newberry, Windows Azure lead at Microsoft UK, said that companies are in the process of moving applications that presently reside on servers in the office onto the cloud. He said, “It is important that we get them through the process; price shouldn’t be a barrier for the customer to choose the best cloud provider.

“At the end of the day it should be about different technical facilities, what is the right environment for a particular workload, a particular application scenario. And that’s why we wanted to take the price discussion off the table and say ‘look, we know prices are changing and this is a market that is developing, but let’s make this about the best environment, the best architecture, the best cloud environment for your particular customer’.”

Newberry said that Microsoft’s Windows Azure service will appeal to those customers who want to make use of existing applications rather than develop ones specifically for cloud deployment. He said, “With customers who have existing infrastructure, existing applications, existing datacenters, that’s not something they want to throw away. They still want to take advantage of cloud technologies, either in terms of private cloud, or using the public cloud as a spiking mechanism – an overflow if you will – for their existing on-premises environment.”

Microsoft has also started to offer support for Linux on its Azure cloud service. Newberry said customers should have no problem running open source software or Linux on its services. However, the firm does see its Windows Azure cloud service being particularly enticing for firms that already run their network infrastructure services using Microsoft’s software, such as Active Directory, SQL Server and SharePoint.

With Microsoft saying it will match Amazon’s pricing, the cloud provider industry might start to see a focus on performance rather than simply competing on low prices to attract customers.

Source

Hitachi Bringing New Xeon Servers To Market

March 12, 2012
Filed under Computing

Hitachi Data Systems announced that it will expand its family of blade and rack server products for the enterprise market. The forthcoming Hitachi Compute Systems will be based on the new Intel Xeon processor E5-2600 family.

Roberto Basilio, vice president of Infrastructure Platforms Product Management at Hitachi Data Systems, said that by leveraging the new Intel Xeon processor E5 family, upcoming Hitachi Compute Systems will feature faster performance, higher density and greater energy efficiency. The servers are being designed for converged data centres and come pre-configured and optimised for leading applications such as Microsoft Exchange 2010, SAP HANA and solutions with VMware.

He said that the Intel Xeon processor E5-2600 product family provides exceptional energy efficiency, increased security, flexible performance and the opportunity to streamline customers’ data centres. The current range of Hitachi Compute Systems consists of two blade server product lines, Hitachi Compute Blade 2000 and Hitachi Compute Blade 320, both of which are intended for high-performance, high-availability applications. The portfolio also includes a family of rack-optimised servers, Hitachi Compute Rack, that are the foundation for dedicated, packaged solutions such as the company’s award-winning object store, Hitachi Content Platform (HCP).

Source

Microsoft Offers Windows Azure Trial

February 23, 2011
Filed under Computing

Microsoft is offering up to 750 free hours of use on its Azure service to lure developers into trying cloud computing, the company announced Tuesday.

“This extended free trial will allow developers to try out the Windows Azure platform without the need for up-front investment costs,” a Microsoft blog entry explained.

The offer arrives just a few weeks after Microsoft promoted Satya Nadella to head its $15 billion server and tools business, which includes the Azure offering. The company raved about Nadella’s experience in ramping up large-scale consumer-focused cloud services such as Bing, and hoped he could bring the same magic to getting Microsoft cloud services into the enterprise.

Participants in the free trial can choose one of two options: 750 hours of use on an Extra Small Compute Instance, or 25 hours on a Small Compute Instance. An Extra Small Compute Instance offers the equivalent of a 1GHz processor with 768MB of working memory and normally costs $0.05 an hour. The Small Compute Instance has a 1.6GHz processor and 1.75GB of working memory, and typically costs $0.12 an hour.
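
At those quoted rates, a quick back-of-the-envelope calculation shows what each free allowance would otherwise cost:

    # What each trial option would cost at the quoted pay-as-you-go rates.
    extra_small = 750 * 0.05  # 750 hours at $0.05/hour
    small = 25 * 0.12         # 25 hours at $0.12/hour
    print("Extra Small option: $%.2f" % extra_small)  # Extra Small option: $37.50
    print("Small option: $%.2f" % small)              # Small option: $3.00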