Syber Group
Toll Free : 855-568-TSTG(8784)

Apple Jumps On The AR Bandwagon

August 26, 2016 by  
Filed under Around The Net

Comments Off on Apple Jumps On The AR Bandwagon

Apple is trying to convince the world it is “coming up with something new” by talking a lot about augmented reality.

It is a fairly logical development: the company has long operated a reality distortion field to create an alternative universe where its products are new, revolutionary and light years ahead of everyone else’s. It will be interesting to see how Apple integrates its reality with the real world, given that it has a problem with that.

Apple CEO Tim Cook has been doing his best to convince the world that Apple really is working on something. He needs to do this as the iPhone cash cow starts to dry up and Jobs Mob appears to have no products to replace it.

In an interview with The Washington Post published Sunday, Cook said Apple is “doing a lot of things” with augmented reality (AR), the technology that puts digital images on top of the real world.
He said:

“I think AR is extremely interesting and sort of a core technology. So, yes, it’s something we’re doing a lot of things on behind that curtain we talked about.”

However, Apple is light years behind the work being done by Microsoft with its HoloLens headset and by the startup Magic Leap, whose so-called cinematic reality is in development now.

Cook appears to retreat to AR whenever he is under pressure. But so far he has never actually said that the company is developing an AR product.

Apple has also snapped up several companies and experts in the AR space. And in January, the Financial Times claimed that the company has a division of hundreds of people researching the technology.

But it would be hard to get an AR product out that fits Apple’s ethos, and certainly not for years. Meanwhile it is unlikely we will see anything new from Apple before Microsoft and Google get their products out.

Courtesy-Fud

 

Are Quantum Computers On The Horizon?

March 18, 2016 by  
Filed under Computing

Comments Off on Are Quantum Computers On The Horizon?

Massachusetts Institute of Technology (MIT) and Austria’s University of Innsbruck claim to have put together a working quantum computer capable of solving a simple mathematical problem.

The architecture they have devised ought to be relatively easy to scale, and could therefore form the basis of workable quantum computers in the future – with a bit of “engineering effort” and “an enormous amount of money”, according to Isaac Chuang, professor of physics, electrical engineering and computer science at MIT.

Chuang’s team has put together a prototype comprising the first five quantum bits (or qubits) of a quantum computer. This is being tested on mathematical factoring problems, which could have implications for applications that use factoring as the basis for encryption to keep information, including credit card details, secure.

The proof-of-concept has been applied only to the number 15, but the researchers claim that this is the “first scalable implementation” of Shor’s algorithm, a quantum algorithm that can quickly calculate the prime factors of large numbers.
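The classical bookkeeping at the heart of Shor’s algorithm is easy to demonstrate. The sketch below factors 15 the same way, except that the order-finding step — the only part a quantum computer actually speeds up — is done by brute force. All names here are our own illustration, not the MIT team’s code:

```python
from math import gcd

def factor_via_order_finding(n, a=7):
    """Factor n using the order-finding idea behind Shor's algorithm.

    Shor's quantum speed-up comes from finding the order r of a mod n
    (the smallest r with a**r % n == 1) quickly; here r is found by
    brute force, which is slow classically but shows the same
    post-processing steps the quantum algorithm relies on.
    """
    g = gcd(a, n)
    if g != 1:
        return g, n // g          # the base already shares a factor: done

    # Find the order r of a modulo n by brute force.
    r = 1
    while pow(a, r, n) != 1:
        r += 1
    if r % 2 == 1:
        raise ValueError("odd order; pick another base a")

    # When r is even, gcd(a^(r/2) - 1, n) and gcd(a^(r/2) + 1, n)
    # yield the two factors.
    x = pow(a, r // 2, n)
    return gcd(x - 1, n), gcd(x + 1, n)

print(factor_via_order_finding(15))   # (3, 5)
```

For 15 with base 7 the order is 4 (since 7⁴ = 2401 ≡ 1 mod 15), giving gcd(7² − 1, 15) = 3 and gcd(7² + 1, 15) = 5. The brute-force loop is what blows up classically for large numbers; Shor’s algorithm finds r in polynomial time on a quantum machine.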

“The team was able to keep the quantum system stable by holding the atoms in an ion trap, where they removed an electron from each atom, thereby charging it. They then held each atom in place with an electric field,” explained MIT.

Chuang added: “That way, we know exactly where that atom is in space. Then we do that with another atom, a few microns away – [a distance] about 100th the width of a human hair.

“By having a number of these atoms together, they can still interact with each other because they’re charged. That interaction lets us perform logic gates, which allow us to realise the primitives of the Shor factoring algorithm. The gates we perform can work on any of these kinds of atoms, no matter how large we make the system.”

Chuang is a pioneer in the field of quantum computing. He designed a quantum computer in 2001 based on one molecule that could be held in ‘superposition’ and manipulated with nuclear magnetic resonance to factor the number 15.

The results represented the first experimental realisation of Shor’s algorithm. But the system wasn’t scalable as it became more difficult to control as more atoms were added.

However, the architecture that Chuang and his team have put together is, he believes, highly scalable and will enable the team to build quantum computing devices capable of factoring much bigger numbers.

“It might still cost an enormous amount of money to build, [and] you won’t be building a quantum computer and putting it on your desktop anytime soon, but now it’s much more an engineering effort and not a basic physics question,” said Chuang.

In other quantum computing news this week, the UK government has promised £200m to support engineering and physical sciences PhD students and fuel UK research into quantum technologies, although most of the cash will be spent on Doctoral Training Partnerships rather than trying to build workable quantum computing prototypes.

Courtesy-TheInq


Seagate Goes 8TB For Surveillance

November 13, 2015 by  
Filed under Computing

Comments Off on Seagate Goes 8TB For Surveillance

Seagate has become the first hard drive company to create an 8TB unit aimed specifically at the surveillance market, targeting system integrators, end users and system installers.

The Seagate Surveillance HDD, as those wags in marketing have named it, is the highest capacity of any specialist drive for security camera set-ups, and Seagate cites its main selling points as maximizing uptime while removing the need for excess support.

“Seagate has worked closely with the top surveillance manufacturers to evolve the features of our Surveillance HDD products and deliver a customized solution that has precisely matched market needs in this evolving space for the last 10 years,” said Matt Rutledge, Seagate’s senior vice president for client storage.

“With HD recordings now standard for surveillance applications, Seagate’s Surveillance HDD product line has been designed to support these extreme workloads with ease and is capable of a 180TB/year workload, three times that of a standard desktop drive.

“It also includes surveillance-optimized firmware to support up to 64 cameras and is the only product in the industry that can support surveillance solutions, from single-bay DVRs to large multi-bay NVR systems.”

The 3.5in drive is designed to run 24/7 and is able to capture 800 hours of high-definition video from up to 64 cameras simultaneously, making it ideal for shopping centers, urban areas, industrial complexes and anywhere else you need to feel simultaneously safe and violated. Its capacity will allow 6PB in a 42U rack.
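The rack-capacity claim is straightforward to sanity-check. A back-of-the-envelope sketch — the per-U drive density is our own arithmetic, not a Seagate figure:

```python
# Sanity check on the claim of 6PB per 42U rack using 8TB drives
# (decimal units, as drive makers count them: 1 PB = 1000 TB).
drive_tb = 8
rack_pb = 6
rack_units = 42

drives_needed = rack_pb * 1000 // drive_tb   # drives per rack
density = drives_needed / rack_units         # drives per rack unit

print(drives_needed)        # 750
print(round(density, 1))    # 17.9
```

750 drives at roughly 18 drives per U implies very dense top-loading enclosures, so the 6PB figure is a best-case fit rather than a typical deployment.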

Included in the deal is the Seagate Rescue Service, capable of restoring lost data in two weeks if circumstances permit, and sold with end users in mind for whom an IT support infrastructure is either non-existent or off-site. The service has a 90 percent success rate and is available as part of the drive cost for the first three years.

Seagate demonstrated the drive today at the China Public Security Expo. Where better than the home of civil liberty infringement to show off the new drive?

Earlier this year, Seagate announced a new co-venture with SSD manufacturer Micron, which will come as a huge relief after the recent merger announcement between WD and SanDisk.

Courtesy-http://www.thegurureview.net/computing-category/seagate-goes-8tb-for-surveillance.html

Will A.I. Create The Next Industrial Revolution?

June 2, 2015 by  
Filed under Computing

Comments Off on Will A.I. Create The Next Industrial Revolution?

Artificial Intelligence will be responsible for the next industrial revolution, experts in the field have claimed, as intelligent computer systems replace certain human-operated jobs.

Four computer science experts talked about how advances in AI could lead to a “hollowing out” of middle-income jobs during a panel debate hosted by ClickSoftware about the future of technology.

“It’s really important that we take AI seriously. It will lead to the fourth industrial revolution and will change the world in ways we cannot predict now,” said AI architect and author George Zarkadakis.

His mention of the “fourth industrial revolution” refers to the computerization of the manufacturing industry.

If the first industrial revolution was the mechanisation of production using water and steam power, followed by the second which introduced mass production with the help of electric power, then the third is what we are currently experiencing: the digital revolution and the use of electronics and IT to further automate production.

The fourth industrial revolution, which is sometimes referred to as Industry 4.0, is the vision of the ‘smart factory’, where cyber-physical systems monitor physical processes, create a virtual copy of the physical world and make decentralized decisions.

These cyber-physical systems communicate and cooperate with each other and humans in real time over the Internet of Things.

Dan O’Hara, professor of cognitive computing at Goldsmiths, University of London, explained that this fourth industrial revolution will not be the same kind of “hollowing out” of jobs that we saw during the last one.

“It [won’t be] manual labour replaced by automation, but it’ll be the hollowing out of middle-income jobs, medium-skilled jobs,” he said.

“The industries that will be affected the most by replacement with automation are construction, accounts and transport. But the biggest [industry] of all, bearing in mind these figures are specific to the US, is retail and sales.”

O’Hara added that many large organisations’ biggest expense is people, who already work alongside intelligent computer systems, and this area is most likely to be affected as companies look to reduce costs.

“Anything that’s already working alongside an AI-based system is bound to be very vulnerable to replacement by AI, as it’s easily automated already,” he said.

However, while AI developments in the retail space could lead to the replacement of jobs, the technology also holds promise.

Mark Bishop, professor of cognitive computing at Goldsmiths, highlighted that AI could save businesses money if it becomes smart enough to detect price variants in company spending, for example by scanning through years of an organisation’s invoice database to find the cheapest historical costs, thus reducing outgoings.
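A minimal sketch of that invoice-scanning idea, with hypothetical records and field names — nothing here comes from an actual system Bishop described:

```python
# Given years of invoice records, find the cheapest historical unit
# price paid per item, so future purchasing can be benchmarked
# against it. Records and field names below are invented for
# illustration.
invoices = [
    {"item": "toner", "supplier": "A", "unit_price": 42.00},
    {"item": "toner", "supplier": "B", "unit_price": 35.50},
    {"item": "paper", "supplier": "A", "unit_price": 3.20},
    {"item": "paper", "supplier": "C", "unit_price": 2.90},
]

cheapest = {}
for inv in invoices:
    item = inv["item"]
    # Keep the invoice with the lowest unit price seen for each item.
    if item not in cheapest or inv["unit_price"] < cheapest[item]["unit_price"]:
        cheapest[item] = inv

for item, inv in sorted(cheapest.items()):
    print(f"{item}: {inv['unit_price']:.2f} from supplier {inv['supplier']}")
```

In practice the "smart" part Bishop alludes to is matching equivalent items across messy, free-text invoice lines; the price comparison itself, as above, is trivial.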

While some worry that AI will take over jobs, others have said that it will replace humans altogether.

John Lewis IT chief Paul Coby said earlier this year that the blending of AI and the IoT in the future could signal the end of civilisation as we know it.

Coby explained that the possibilities are already with us in terms of AI and that we ought to think about how “playing with the demons” could be detrimental to our future.

Apple co-founder Steve Wozniak added to previous comments from Stephen Hawking and Elon Musk with claims that “computers are going to take over from humans”.

Woz made his feelings on AI known during an interview with the Australian Financial Review, and agreed with Hawking and Musk that its potential to surpass humans is worrying.

“Computers are going to take over from humans, no question. Like people including Stephen Hawking and Elon Musk have predicted, I agree that the future is scary and very bad for people,” he said.

Source

ARM Buys Offspark For IoT

February 19, 2015 by  
Filed under Computing

Comments Off on ARM Buys Offspark For IoT

ARM has snaffled up Dutch Internet of Things (IoT) company Offspark.

The move is designed to improve ARM’s security credentials for IoT offerings.

Offspark is the creator of PolarSSL, a widely used SSL/TLS library for IoT security products, and ARM hopes that the combined companies can offer a one-stop shop for IoT developers.

Krisztian Flautner, ARM’s IoT manager, said: “PolarSSL technology is already deployed by the leading IoT players.

“The fact that those same companies also use ARM Cortex processor and software technologies means we are now able to provide a complete bedrock solution for the industry to innovate from.”

The product will be renamed ARM Mbed TLS, but will remain open source, reports Tech Week Europe.

Paul Bakker, CEO of Offspark, added: “Security is the most fundamental aspect in ensuring people trust IoT technology and that is only possible with a truly tailored solution.

“Together, ARM and Offspark can provide security to the edge of any system and we look forward to working with our partners to help them deliver some exciting new projects.”

Developers will be able to license the technology for commercial use as well as embedding it into future ARM products.

Last week the company released the ARM Cortex-A72 processor, a 64-bit effort offering support for Android 5.x Lollipop and incorporating the big.LITTLE architecture, which assigns jobs to different processor cores based on their computational requirements.

A message on the Offspark website indicates that it has been taken down and redirects to ARM.

Source

ARM Develops IoT For Students

February 3, 2015 by  
Filed under Computing

Comments Off on ARM Develops IoT For Students

ARM has created a course to teach IoT skills to students at University College London (UCL).

The course is designed to encourage graduates in science, technology, engineering and maths (Stem) to seek careers in IT.

The IoT Education Kit will teach students how to use the Mbed IoT operating system to create smartphone apps that control mini-robots or wearable devices.

Students are expected to be interested in building their own IoT business, or joining IoT-focused enterprises like ARM. The course will also try to limit the number of Stem graduates pursuing non-technology careers.

ARM reported statistics from a 2012 study by Oxford Policy and Research revealing how many engineering graduates (36 percent of males, 51 percent of females), technology graduates (44 percent, 53 percent) and computer scientists (64 percent, 66 percent) end up with non-Stem jobs.

The IoT Education Kit will be rolled out by UCL’s Department of Electronics from September 2015, with a week-long module for full-time and continuing professional development students.

The Kit comprises a complete set of teaching materials, Mbed-enabled hardware boards made by Nordic Semiconductor, and software licensed from ARM. A second teaching module for engineering graduates is being developed for 2016.

“Students with strong science and mathematical skills are in demand and we need to make sure they stay in engineering,” said ARM CTO Mike Muller.

“The growth of the IoT gives us a great opportunity to prove to students why our profession is more exciting and sustainable than others.”

UCL professor Izzat Darwazeh also highlighted the importance of Stem skills, saying that “many students are not following through to an engineering career and that is a real risk to our long-term success as a nation of innovators”.

Source

NSA Developing System To Crack Encryption

January 13, 2014 by  
Filed under Computing

Comments Off on NSA Developing System To Crack Encryption

The U.S. National Security Agency is working to develop a computer that could ultimately break most encryption programs, whether they are used to protect other nations’ spying programs or consumers’ bank accounts, according to a report by the Washington Post.

The report, which the newspaper said was based on documents leaked by former NSA contractor Edward Snowden, comes amid continuing controversy over the spy agency’s program to collect the phone records and Internet communications of private citizens.

In its report, The Washington Post said that the NSA is trying to develop a so-called “quantum computer” that could be used to break encryption codes used to cloak sensitive information.

Such a computer, which would be able to perform several calculations at once instead of in a single stream, could take years to develop, the newspaper said. In addition to being able to break through the cloaks meant to protect private data, such a computer would have implications for such fields as medicine, the newspaper reported.
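The “several calculations at once” point comes from superposition: n qubits carry amplitudes for all 2ⁿ classical bit strings, so a single operation acts on every one of them. A toy statevector sketch in plain Python — an illustration of the principle, not of any NSA design:

```python
# n qubits hold amplitudes for all 2**n classical bit strings at once.
n = 3
amplitudes = [1 / (2 ** n) ** 0.5] * (2 ** n)  # uniform superposition of 8 states

# One "oracle" step flips the sign of the state matching 0b101;
# classically, checking every input would take 2**n separate evaluations.
amplitudes = [-a if i == 0b101 else a for i, a in enumerate(amplitudes)]

print(len(amplitudes))                                  # 8 basis states updated by one step
print(abs(sum(a * a for a in amplitudes) - 1) < 1e-9)   # True: the state stays normalised
```

The catch, which the newspaper’s framing glosses over, is that measurement yields only one of those outcomes, so useful algorithms such as Shor’s must arrange for the amplitudes to interfere before anything is read out.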

The research is part of a $79.7 million research program called “Penetrating Hard Targets,” the newspaper said. Other, non-governmental researchers are also trying to develop quantum computers, and it is not clear whether the NSA program lags the private efforts or is ahead of them.

Snowden, living in Russia with temporary asylum, last year leaked documents he collected while working for the NSA. The United States has charged him with espionage, and more charges could follow.

His disclosures have sparked a debate over how much leeway to give the U.S. government in gathering information to protect Americans from terrorism, and have prompted numerous lawsuits.

Last week, a federal judge ruled that the NSA’s collection of phone call records is lawful, while another judge earlier in December questioned the program’s constitutionality. The issue is now more likely to move before the U.S. Supreme Court.

On Thursday, the editorial board of the New York Times said that the U.S. government should grant Snowden clemency or a plea bargain, given the public value of revelations over the National Security Agency’s vast spying programs.

Source

IBM’s Watson Shows Up For Work

January 2, 2012 by  
Filed under Computing

Comments Off on IBM’s Watson Shows Up For Work

IBM’s Watson supercomputer is about to start work evaluating evidence-based cancer treatment options that can be delivered to doctors in a matter of seconds for assessment.

IBM and WellPoint, which is Blue Cross Blue Shield’s largest health plan, are developing applications that will essentially turn the Watson computer into an adviser for oncologists at Cedars-Sinai’s Samuel Oschin Comprehensive Cancer Institute in Los Angeles, according to Steve Gold, director of worldwide marketing for IBM Watson Solutions.

Cedars-Sinai’s historical data on cancer as well as its current clinical records will be ingested into an iteration of IBM’s Watson that will reside at WellPoint’s headquarters. The computer will act as a medical data repository on multiple types of cancer. WellPoint will then work with Cedars-Sinai physicians to design and develop applications as well as validate their capabilities.

Dr. M. William Audeh, medical director of the cancer institute, will work closely with WellPoint’s clinical experts to provide advice on how Watson may best be used in clinical practice to support increased understanding of the evolving body of knowledge on cancer, including emerging therapies not widely known by physicians.

IBM announced earlier this year that healthcare would be the first commercial application for the computer, which defeated two human champions on the popular television game show Jeopardy! in February.

Source…