
Suse Goes 64-bit ARM Servers

July 28, 2015
Filed under Computing

Suse wants to speed the development of server systems based on 64-bit ARM processors.

The outfit said that it is making available to its partners a version of Suse Linux Enterprise 12 ported to ARM’s 64-bit architecture (AArch64).

This will enable them to develop, test and deliver products to the market based on ARM chips.

Suse has also added AArch64 support to its openSUSE Build Service. This allows the community to build packages against real 64-bit ARM hardware and the Suse Linux Enterprise 12 binaries.
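For application developers, porting usually starts with simply detecting the target architecture. As a purely illustrative Python sketch (not part of Suse’s tooling), a program might gate its ARM64-specific paths like this:

import platform

def is_aarch64() -> bool:
    # AArch64 reports as "aarch64" on Linux and "arm64" on some platforms
    return platform.machine() in ("aarch64", "arm64")

if is_aarch64():
    print("AArch64 detected: using 64-bit ARM code paths")
else:
    print(f"{platform.machine()} detected: using generic code paths")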

The firm said it hopes this will improve the time to market for ARM-based solutions.

Suse’s partners include chip makers AMD, AppliedMicro and Cavium, as well as system vendors Dell, HP and SoftIron. Suse wants ARM processors to be part of a scalable technology platform in the data centre.

Through participation in the programme, partners will be able to build solutions for various applications, from purpose-built appliances for security, medical and network functions, to hyperscale computing, distributed storage and software-defined networking.

There are multiple vendors using the same core technology licensed from ARM. This provides a common base for OS vendors, like Suse, to build support into their kernels.

Suse has some competition for ARM-based systems. Last year, Red Hat started up its ARM Partner Early Access Programme (PEAP), while Canonical has offered ARM support in its Ubuntu platform for several years now, including a long-term support (LTS) release last year that included the OpenStack cloud computing framework.

Source

China Keeps Supercomputing Title

July 24, 2015
Filed under Computing

A supercomputer developed by China’s National University of Defense Technology is still the fastest publicly known computer in the world, while the U.S. is close to an historic low in the latest edition of the closely followed Top 500 supercomputer ranking, which has just been published.

The Tianhe-2 computer, based at the National Supercomputer Center in Guangzhou, has been at the top of the list for more than two years, and its maximum achieved performance of 33,863 teraflops is almost double that of the U.S. Department of Energy’s Cray Titan supercomputer at the Oak Ridge National Laboratory in Tennessee.

The IBM Sequoia computer at the Lawrence Livermore National Laboratory in California is the third fastest machine, and fourth on the list is the Fujitsu K computer at Japan’s Advanced Institute for Computational Science. The only new machine to enter the top 10 is the Shaheen II computer of King Abdullah University of Science and Technology in Saudi Arabia, which is ranked seventh.

The Top 500 list, published twice a year to coincide with supercomputer conferences, is closely watched as an indicator of the status of development and investment in high-performance computing around the world. It also provides insights into what technologies are popular among organizations building these machines, but participation is voluntary. It’s quite possible a number of secret supercomputers exist that are not counted in the list.

With 231 machines in the Top 500 list, the U.S. remains the top country in terms of the number of supercomputers, but that’s close to the all-time low of 226 hit in mid-2002, right about the time China began appearing on the list. China rose to claim 76 machines this time last year, but the latest count has it at 37 computers.

The Top 500 list is compiled by supercomputing experts at the University of Mannheim, Germany; the University of Tennessee, Knoxville; and the Department of Energy’s Lawrence Berkeley National Laboratory.

Source

IBM Partners With Box

July 6, 2015
Filed under Computing

IBM and Box have signed a global agreement to combine their strengths into a cloud powerhouse.

The star-crossed ones said in a joint statement: “The integration of IBM and Box technologies, combined with our global cloud capabilities and the ability to enrich content with analytics, will help unlock actionable insights for use across the enterprise.”

Box will bring its collaboration and productivity tools to the party, while IBM brings social, analytic, infrastructure and security services.

The move is described as a strategic alliance and will see the two companies jointly market products under a co-branded banner.

IBM will enable the use of Box APIs in enterprise apps and web services to make a whole new playground for developers.

The deal will see Box integrate IBM’s content management, including content capture, extraction, analytics, case management and governance. Also aboard will be Watson Analytics to study in depth the content being stored in Box.

Box will also be integrated into IBM Verse and IBM Connections to allow full integration for email and social.

IBM’s security and consulting services will be part of the deal, and the companies will work together to create mobile apps for industries under the IBM MobileFirst programme.

Finally, the APIs for Box will be enabled in Bluemix, meaning that anyone working on rich apps in the cloud can make Box a part of their creation.
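By way of illustration, Box’s existing v2 REST API is plain HTTPS, so wiring it into a cloud app is straightforward. A minimal Python sketch of listing the root folder, assuming you already hold a valid OAuth access token (the token below is a placeholder):

import requests

ACCESS_TOKEN = "YOUR_BOX_OAUTH_TOKEN"  # placeholder; obtained via Box OAuth 2.0

# Folder id "0" is the root of a Box account in the v2 API
resp = requests.get(
    "https://api.box.com/2.0/folders/0",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=10,
)
resp.raise_for_status()

for item in resp.json()["item_collection"]["entries"]:
    print(item["type"], item["name"])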

Box seems to be the Nick Clegg to IBM’s ham-faced posh-boy robot in this relationship, but is in fact bringing more than you’d think to the party with innovations delivered by its acquisition of 3D modelling company Verold.

What’s more, the results of these collaborations should allow another major player to join Microsoft and Google in the wars over productivity platforms.

It was announced today that Red Hat and Samsung are forming their own coalition to bring enterprise mobile out of the hands of the likes of IBM and Apple, which already have a cool thing going on with MobileFirst.

Source

IBM Buys Blue Box

June 15, 2015
Filed under Computing

IBM HAS ACQUIRED Blue Box in an attempt to make its cloud offering even bluer. The Seattle-based company specialises in simple private cloud-as-a-service offerings based on OpenStack.

This, of course, fits in with IBM’s new direction of a Power and OpenStack cloud-based world, as demonstrated by its collaboration with MariaDB on TurboLAMP.

IBM’s move to the cloud is starting to pay off, with revenue of $7.7bn in the 12 months to March 2015 and growth of more than 16 percent in the first quarter of this year.

The company plans to use the new acquisition to enable rapid integration of cloud-based applications and on-premise systems within the OpenStack managed cloud.

Blue Box also brings a remotely managed OpenStack offering that provides customers with a local cloud, better visibility and control, and tighter security.

“IBM is dedicated to helping our clients migrate to the cloud in an open, secure, data rich environment that meets their current and future business needs,” said IBM general manager of cloud services Jim Comfort.

“The acquisition of Blue Box accelerates IBM’s open cloud strategy, making it easier for our clients to move data and applications across clouds and adopt hybrid cloud environments.”

Blue Box will offer customers a more cohesive, consistent and simplified experience, while at the same time integrating with existing IBM packages like the Bluemix digital innovation platform. The firm also offers a single unified control panel for customer operations.

“No brand is more respected in IT than IBM. Blue Box is building a similarly respected brand in OpenStack,” said Blue Box founder and CTO Jesse Proudman.

“Together, we will deliver the technology and products businesses need to give their application developers an agile, responsive infrastructure across public and private clouds.

“This acquisition signals the beginning of new OpenStack options delivered by IBM. Now is the time to arm customers with more efficient development, delivery and lower cost solutions than they’ve seen thus far in the market.”

IBM has confirmed that it plans to help Blue Box customers to grow their technology portfolio, while taking advantage of the broader IBM product set.

Source

Will A.I. Create The Next Industrial Revolution?

June 2, 2015
Filed under Computing

Artificial Intelligence will be responsible for the next industrial revolution, experts in the field have claimed, as intelligent computer systems replace certain human-operated jobs.

Four computer science experts talked about how advances in AI could lead to a “hollowing out” of middle-income jobs during a panel debate hosted by ClickSoftware about the future of technology.

“It’s really important that we take AI seriously. It will lead to the fourth industrial revolution and will change the world in ways we cannot predict now,” said AI architect and author George Zarkadakis.

His mention of the “fourth industrial revolution” refers to the computerization of the manufacturing industry.

If the first industrial revolution was the mechanisation of production using water and steam power, followed by the second which introduced mass production with the help of electric power, then the third is what we are currently experiencing: the digital revolution and the use of electronics and IT to further automate production.

The fourth industrial revolution, which is sometimes referred to as Industry 4.0, is the vision of the ‘smart factory’, where cyber-physical systems monitor physical processes, create a virtual copy of the physical world and make decentralized decisions.

These cyber-physical systems communicate and cooperate with each other and humans in real time over the Internet of Things.

Dan O’Hara, professor of cognitive computing at Goldsmiths, University of London, explained that this fourth industrial revolution will not be the same kind of “hollowing out” of jobs that we saw during the last one.

“It [won’t be] manual labour replaced by automation, but it’ll be the hollowing out of middle-income jobs, medium-skilled jobs,” he said.

“The industries that will be affected the most from a replacement with automation are construction, accounts and transport. But the biggest [industry] of all, remembering this is respective to the US, is retail and sales.”

O’Hara added that many large organisations’ biggest expense is people, who already work alongside intelligent computer systems, and this area is most likely to be affected as companies look to reduce costs.

“Anything that’s working on an AI-based system is bound to be very vulnerable to the replacement by AI as it’s easily automated already,” he said.

However, while AI developments in the retail space could lead to the replacement of jobs, the technology is also rather promising.

Mark Bishop, professor of cognitive computing at Goldsmiths, highlighted that AI could save businesses money if it becomes smart enough to spot price variations in company spending, for example by scanning years of an organisation’s invoice database and detecting the cheapest costs, thus saving on outgoings.
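Much of that example is really just data crunching rather than deep learning. A toy Python sketch, with invented records and field names, of flagging the cheapest recorded supplier per item across historical invoices:

# Toy invoice records: (supplier, item, unit_price) -- entirely invented
invoices = [
    ("Acme", "paper", 2.40), ("Bitz", "paper", 1.95),
    ("Acme", "toner", 38.00), ("Bitz", "toner", 41.50),
]

cheapest = {}  # item -> (supplier, unit_price)
for supplier, item, price in invoices:
    if item not in cheapest or price < cheapest[item][1]:
        cheapest[item] = (supplier, price)

for item, (supplier, price) in sorted(cheapest.items()):
    print(f"{item}: cheapest supplier was {supplier} at {price:.2f}")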

While some worry that AI will take over jobs, others have said that it will replace humans altogether.

John Lewis IT chief Paul Coby said earlier this year that the blending of AI and the IoT in the future could signal the end of civilisation as we know it.

Coby explained that the possibilities are already with us in terms of AI and that we ought to think about how “playing with the demons” could be detrimental to our future.

Apple co-founder Steve Wozniak added to previous comments from Stephen Hawking and Elon Musk with claims that “computers are going to take over from humans”.

Woz made his feelings on AI known during an interview with the Australian Financial Review, and agreed with Hawking and Musk that its potential to surpass humans is worrying.

“Computers are going to take over from humans, no question. Like people including Stephen Hawking and Elon Musk have predicted, I agree that the future is scary and very bad for people,” he said.

Source

Intel Rewards RealSense Developers

May 21, 2015
Filed under Computing

Intel has awarded $1m to a number of developers as part of its RealSense 3D App Challenge, which was launched last year.

Announced by Intel president Renee James at Computex 2014, the RealSense App Challenge was part of Intel’s efforts to boost RealSense globally and generate software innovation around the ecosystem.

More than 7,000 software creators in 37 countries applied to compete, and 400 were selected to develop new applications for entertainment, learning and collaboration.

Several hundred developers of creative app ideas in these categories received the latest edition of the RealSense 3D camera and the RealSense software development kit, which included free tools, examples and application programming interfaces with which to develop their ideas.
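The contest entrants worked against Intel’s own RealSense SDK. For a flavour of what depth-camera code looks like, here is a minimal sketch using the later open-source librealsense Python binding, pyrealsense2 (an illustrative assumption, not the SDK the entrants used), which reads the distance to the centre of the frame from an attached camera:

import pyrealsense2 as rs

pipeline = rs.pipeline()
pipeline.start()  # begin streaming from an attached RealSense camera
try:
    frames = pipeline.wait_for_frames()
    depth = frames.get_depth_frame()
    # Distance in metres to whatever sits at the centre pixel
    x, y = depth.get_width() // 2, depth.get_height() // 2
    print(f"Centre of frame is {depth.get_distance(x, y):.2f} m away")
finally:
    pipeline.stop()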

Intel announced on Thursday that the grand prize winner, who picks up $100,000, is Brazilian developer Alexandre Ribeiro da Silva of Anima Games.

His Seed app requires gamers to use reflexes and rational thinking to solve puzzles. The goal of the game is to guide a little floating seed through its journey to reforest a devastated land.

The second prize of $50,000 was awarded to Canadian developer David Schnare of Kinetisense. His OrthoSense app uses RealSense to help medical professionals remotely rehabilitate a patient who has suffered a hand injury by tracking their range of movement over time.

“This practical application of human-computer interaction is an impressive example of how technology can make our lives better,” Intel said.

Another notable winner was Lee Bamber from the UK, who received recognition for his virtual 3D video maker. The app allows a user to record themselves as a 3D hologram and then transport to a variety of scenes.

Once recorded, they can then change the camera position over the course of the playback to add an extra dimension to video blogs, storybooks or v-mails, for instance.

“The idea of the app is that you can choose the backdrop then set the lighting as you would in a studio then do the acting,” Bamber explained in his video.

Doug Fisher, SVP and general manager of Intel’s Software and Services Group, said in a blog post that now the app challenge is complete “the real work begins”, as Intel Software will continue to encourage all finalists to bring products to market.

“We also will continue mobilising our resources to inspire, educate and advance innovation through programmes such as the Intel Developer Zone, where developers can engage to find new software tools and build industry relationships,” he said.

“Human-computer interactions will no longer be defined by mice, keyboards and 2D displays. Our physical and digital worlds are coming together. When they do, the opportunities for us as consumers and businesses will explode.”

Source

SUSE Brings Hadoop To IBM z Mainframes

April 1, 2015
Filed under Computing

SUSE and Apache Hadoop vendor Veristorm are teaming up to bring Hadoop to IBM z and IBM Power systems.

The result means that, regardless of system architecture, users will be able to run Apache Hadoop within a Linux container on their existing hardware, so more users than ever will be able to turn big data into meaningful information that informs their business decisions.

Veristorm’s Data Hub and vStorm Enterprise Hadoop will now be available as zDoop, the first mainframe-compatible Hadoop implementation, running on SUSE Linux Enterprise Server for System z, or on IBM Power8 machines in little-endian mode, which makes it significantly easier for x86-based software to be ported to the IBM platform.
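The workloads this enables are classic Hadoop fare. As a flavour, here is the canonical word-count job written for Hadoop Streaming in Python; this is generic Hadoop usage rather than anything zDoop-specific:

import sys
from itertools import groupby

def mapper():
    # Emit "word<TAB>1" for every word read from stdin
    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")

def reducer():
    # Hadoop delivers mapper output sorted by key; sum counts per word
    pairs = (line.rstrip("\n").split("\t") for line in sys.stdin)
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        print(f"{word}\t{sum(int(n) for _, n in group)}")

if __name__ == "__main__":
    # Run as "wordcount.py map" for the map phase; anything else reduces
    mapper() if sys.argv[1:] == ["map"] else reducer()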

SUSE and Veristorm have also committed to work together on educating partners and channels on the benefits of the overall package.

Naji Almahmoud, head of global business development for SUSE, said: “The growing need for big data processing to make informed business decisions is becoming increasingly unavoidable.

“However, existing solutions often struggle to handle the processing load, which in turn leads to more servers and difficult-to-manage sprawl. This partnership with Veristorm allows enterprises to efficiently analyse their mainframe data using Hadoop.”

Veristorm launched Hadoop for Linux in April of last year, explaining that it “will help clients to avoid staging and offloading of mainframe data to maintain existing security and governance controls”.

Sanjay Mazumder, CEO of Veristorm, said that the partnership will help customers “maximize their processing ability and leverage their richest data sources” and deploy “successful, pragmatic projects”.

SUSE has been particularly active of late, announcing last month that its software-defined Enterprise Storage product, built around the open source Ceph framework, was to become available as a standalone product for the first time.

Source

IBM Debuts New Mainframe

March 27, 2015
Filed under Computing

IBM has started shipping its all-new z13 mainframe computer.

IBM has high hopes that the upgraded model will generate solid sales, based not only on usual customer upgrade patterns but also on a design aimed at helping customers cope with expanding mobile usage, data analytics, tighter security and more “cloud” remote computing.

Mainframes are still a major part of the Systems and Technology Group at IBM, which overall contributed 10.8 percent of IBM’s total 2014 revenues of $92.8 billion. But the z Systems and their predecessors also generate revenue from software, leasing and maintenance and thus have a greater financial impact on IBM’s overall picture.

The new mainframe’s claim to fame is its use of simultaneous multi-threading (SMT) to execute two instruction streams (or threads) on a processor core, which delivers more throughput for Linux on z Systems and for IBM z Integrated Information Processor (zIIP) eligible workloads.

There is also single instruction, multiple data (SIMD), a vector processing model providing instruction-level parallelism, to speed up workloads such as analytics and mathematical modelling. All this means that COBOL 5.2 and PL/I 4.5 can exploit SIMD and improved floating-point enhancements to deliver improved performance over and above that provided by the faster processor.
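The idea SIMD exploits, one instruction applied to many data elements at once, is easy to see from Python with NumPy, which dispatches whole-array operations to the CPU’s vector units internally. This is a rough illustration of the principle, not of z13 hardware:

import numpy as np

data = np.random.rand(1_000_000)

# Scalar style: one multiply per Python loop iteration
scalar = [x * 1.2 for x in data]

# Vectorised style: one operation over the whole array, executed
# internally with the CPU's SIMD instructions
vector = data * 1.2

assert np.allclose(scalar, vector)  # same results, far faster to compute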

Its on-chip cryptographic and compression coprocessors receive a performance boost, improving cryptographic performance for both general processors and the Integrated Facility for Linux (IFL), and allowing more data to be compressed, which helps to save disk space and reduce data transfer time.

There is also a redesigned cache architecture, using eDRAM technology to provide twice as much second-level cache and substantially more third- and fourth-level cache compared with the zEC12. Bigger and faster caches help to avoid untimely swaps and memory waits while maximising the throughput of concurrent workloads.

Tom McPherson, vice president of z System development, said that the new model was not just about microprocessors, though it has many eight-core chips in it. With semiconductor scaling slowing down, and everything having to be cooled by a combination of water and air, “you have to get the value by optimizing”.

The first real numbers on how the z13 is selling won’t be public until comments are made in IBM’s first-quarter report, due out in mid-April, when a little more than three weeks’ worth of billings will flow into it.

The company’s fiscal fortunes have sagged, drawing mixed reviews from both analysts and the blogosphere, much of which revolves around IBM’s lag in cloud services. IBM is positioning the mainframe as a prime cloud server: one of the systems that cloud computing actually goes to and runs on.

Source

Can Linux Succeed On The Desktop?

March 25, 2015
Filed under Computing

Every three years I install Linux and see if it is ready for prime time yet, and every three years I am disappointed. What is so disappointing is not so much that the operating system is bad, it never has been, but that whoever designs it refuses to think of the user.

To be clear, I will lay out the same rider I have for my other three reviews. I am a Windows user, but not out of choice. One of the reasons I keep checking out Linux is the hope that it will have fixed its basic problems in the intervening years. Fortunately for Microsoft, it never has.

This time my main computer had a serious outage caused by a dodgy Corsair (which is now a c word) power supply, and I have been out of action for the last two weeks. In the meantime I had to run everything on a clapped-out Fujitsu notebook which took 20 minutes to download a webpage.

One Ubuntu Linux install later, it was behaving like a normal computer. This is where Linux has always been far better than Windows – making rubbish computers behave. So I could settle down to work, right? Well, not really.

This is where Linux has consistently disqualified itself from prime-time every time I have used it. Going back through my reviews, I have been saying the same sort of stuff for years.

Coming from Windows 7, where a user can install the system and start work with no learning curve, it is impossible. Ubuntu can’t match that. There is a ton of stuff you have to download before you can get anything that passes for an ordinary service, and this downloading is far too tricky for anyone who is used to Windows.

It is not helped by the Ubuntu Software Centre, which is supposed to make life easier for you. Say you need to download a flash player. Adobe has a flash player you can download for Ubuntu. Click on it and Ubuntu asks if you want to open the file with the Ubuntu Software Centre to install it. You would think you would want this, right? The thing is, pressing yes opens the Software Centre but does not download the Adobe flash player. The Centre then says it can’t find the software on your machine.

Here is the problem which I wrote about nearly nine years ago – you can’t download Flash or anything proprietary because that would mean contaminating your machine with something that is not Open Sauce.

Sure, Ubuntu will download all those proprietary drivers, but you have to know to ask – an issue which has been around for so long now that it is silly. The issue of proprietary drivers is only a problem for the hard-core open saucers, and there are not enough of them to keep an operating system in the dark ages for a decade. However, they have managed it.

I downloaded LibreOffice and all those other things needed to get a basic “Windows experience” and discovered that all those typefaces you know and love are unavailable. They should have been in the proprietary pack, but Ubuntu has a problem installing them. This means that I can’t share documents in any meaningful way with Windows users, because all my formatting is screwed.

LibreOffice is not bad, but it really is not Microsoft Word and anyone who tries to tell you otherwise is lying.

I downloaded and configured Thunderbird for mail, and for a few good days it actually worked. However, yesterday it disappeared from the sidebar and I can’t find it anywhere. I am restricted to webmail, and I am really hating Microsoft’s Outlook experience.

The only thing that is different between this review and the one I wrote three years ago is that there are now games which actually work, thanks to Steam. I have not tried this out yet because I am too stressed with the work backlog caused by having to work on Linux without my regular software, but there is a feeling that Linux is at last moving to a point where it can be a little bit useful.

So what are the main problems that Linux refuses to address? Usability, interface and compatibility.

I know Ubuntu is famous for its shit interface, and Gnome is supposed to be better, but both look and feel dated. I also hate Windows 8’s interface, which requires you to navigate a touchscreen tablet interface when you have neither a touchscreen nor a tablet. It should have been an opportunity for open saucers to trump Windows with a nice interface – it wasn’t.

You would think that all the brains in the Linux community could come up with a simple, easy-to-use interface which gives you access to all the files you need without much trouble. The problem is that Linux fans like to tinker; they don’t want usability and they don’t have problems with command screens. Ordinary users, particularly those of more recent generations, will not go near a command screen.

Compatibility issues for games have been pretty much resolved, but other key software is missing and Linux operators do not seem keen to get it on board.

I do a lot of layout and graphics work. When you complain about not being able to use Photoshop, Linux fanboys proudly point to GIMP and say that it does the same things. You want to grab them by the throat and stuff their heads down the loo and flush. GIMP does less than a tenth of what Photoshop can do, and it does it very badly. There is nothing available on Linux that can do what CS or any real desktop publisher can do.

Proprietary software designed for real people using a desktop tends to trump anything open saucy, even if it is producing a technology marvel.

So in all these years, Linux has not attempted to fix any of the problems which have effectively crippled it as a desktop product.

I will look forward to next week when the new PC arrives and I will not need another Ubuntu desktop experience. Who knows, maybe they will have sorted it in another three years’ time.

Source

IBM Goes Bare Metal

March 18, 2015
Filed under Computing

IBM has announced the availability of OpenPower servers as part of the firm’s SoftLayer bare metal cloud offering.

OpenPower, a collaborative foundation run by IBM in conjunction with Google and Nvidia, offers a more open approach to IBM’s Power architecture, and a more liberal licence for the code, in return for shared wisdom from member organisations.

Working in conjunction with Tyan and Mellanox Technologies, both partners in the foundation, the bare metal servers are designed to help organisations easily and quickly extend infrastructure in a customized manner.

“The new OpenPower-based bare metal servers make it easy for users to take advantage of one of the industry’s most powerful and open server architectures,” said Sonny Fulkerson, CIO at SoftLayer.

“The offering allows SoftLayer to deliver a higher level of performance, predictability and dependability not always possible in virtualised cloud environments.”

Initially, the servers will run Linux applications and will be based on the IBM Power8 architecture, in the same mould as IBM Power Systems servers.

This will later expand to the Power ecosystem and then to independent software vendors that support Linux on Power application development, and are migrating applications from x86 to the Power architecture.

OpenPower servers are based on open source technology that extends right down to the silicon level, and can allow highly customised servers ranging from physical to cloud, or even hybrid.

Power systems are already installed in SoftLayer’s Dallas data centre, and there are plans to expand to data centres throughout the world. The system was first rolled out in 2014 as part of the Watson portfolio.

Prices will be announced when general availability arrives in the second quarter.

Source
