
Will Google’s Algorithm Stop Piracy?

October 30, 2014
Filed under Computing

Nosey Google has updated its search engine algorithms in an attempt to stop piracy websites from appearing high in its search rankings.

The update will mean piracy sites are less likely to appear when people search for music, films and other copyrighted content.

The decision to roll out the search changes was announced in a refreshed version of Google's How Google Fights Piracy report, originally published in September 2013.

However, this year’s updated report features a couple of developments, including changes to ad formats and an improved DMCA demotion search signal.

The move is likely to be a result of criticism received from the entertainment industry, which has argued that illegal sites should be “demoted” in search results because they enable people to find sites to download media illegally.

The biggest change in the update is new ad formats for search queries related to music and movies, designed to help people find legitimate sources of media.

For example, for the relatively small number of queries for movies that include terms like ‘download’, ‘free’, or ‘watch’, Google has instead begun listing legal services such as Spotify and Netflix in a box at the top of the search results.
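As a toy illustration of the behavior described above, the sketch below flags queries containing those trigger terms; the term list and function are assumptions for illustration only, not Google's actual implementation:

```python
# Toy sketch (not Google's real system): flag movie-related queries that
# contain the trigger terms the article mentions, which would cause a box
# of legal services to be shown above the results.
PIRACY_INTENT_TERMS = {"download", "free", "watch"}

def should_show_legal_box(query: str) -> bool:
    """Return True if the query contains one of the trigger terms."""
    tokens = query.lower().split()
    return any(term in tokens for term in PIRACY_INTENT_TERMS)

print(should_show_legal_box("watch gravity online free"))  # True
print(should_show_legal_box("gravity reviews"))            # False
```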

“We’re also testing other ways of pointing people to legitimate sources of music and movies, including in the right-hand panel on the results page,” Google added.

“These results show in the US only, but we plan to continue investing in this area and to expand it internationally.”

Also rolling out as part of the refresh is an improved DMCA demotion signal in Google search, which down-ranks sites for which Google has received a large number of valid DMCA notices.

“We’ve now refined the signal in ways we expect to visibly affect the rankings of some of the most notorious sites. This update will roll out globally starting next week,” Google said, adding that it will also be removing more terms from autocomplete, based on DMCA removal notices.
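The report does not describe how the demotion is computed, but the idea can be sketched as a penalty that grows with the volume of valid notices; the function name and logarithmic penalty curve below are assumptions for illustration, not Google's algorithm:

```python
# Hypothetical sketch of a DMCA demotion signal: a page's base ranking
# score is scaled down as valid takedown notices accumulate. The
# logarithmic penalty curve is an assumption, not Google's actual formula.
import math

def demoted_score(base_score: float, valid_dmca_notices: int) -> float:
    """Scale a ranking score down by the volume of valid DMCA notices."""
    penalty = 1.0 / (1.0 + math.log1p(valid_dmca_notices))
    return base_score * penalty

print(demoted_score(100.0, 0))     # 100.0, no notices means no demotion
print(demoted_score(100.0, 5000))  # ~10.5, a notorious site is demoted hard
```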

The new measures might be welcomed by the entertainment industry, but they are more likely to encourage people to use legal streaming alternatives such as Spotify and Netflix than to buy more physical media.

Source

The FCC Extends Deadline

August 25, 2014
Filed under Around The Net

The U.S. Federal Communications Commission said it would accept public comments on its proposed new “net neutrality” rules through September 15, giving the American public extra time to voice their opinions and concerns about how Internet traffic should be regulated.

The FCC has already received more than 1 million comments on the new rules governing how Internet service providers should be allowed to manage web traffic on their networks.

The FCC had set a deadline of July 15 for initial comments and September 10 for replies to those comments. However, the surge in submissions overwhelmed the FCC’s website, and the agency delayed the first deadline by three business days.

“To ensure that members of the public have as much time as was initially anticipated to reply to initial comments in these proceedings, the Bureau today is extending the reply comment deadline by three business days,” the FCC said on Friday, delaying the final deadline for comments to September 15.

Source

OpenSSL Gets Updated

August 20, 2014
Filed under Security

OpenSSL, the web security layer at the center of the Heartbleed vulnerability, has been issued with a further nine critical patches.

While none are as serious as Heartbleed, patching is recommended for all users, according to an advisory released today. The vulnerabilities were reported during June and July of this year by security research teams around the web, including at Google, LogMeIn and Codenomicon.

One of the more interesting fixes involves a flaw in the ClientHello message process. If a ClientHello message is badly fragmented, it is vulnerable to a man-in-the-middle attack that could force the server to downgrade to the TLS 1.0 protocol, a fifteen-year-old version that predates the Heartbleed fix.
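A general client-side defense against this class of downgrade attack is to refuse to negotiate old protocol versions at all. The sketch below shows that mitigation using Python's standard ssl module; it is a defensive illustration, not the OpenSSL patch itself:

```python
# Defensive sketch: reject any handshake below TLS 1.2, so a
# man-in-the-middle cannot force a fallback to the ancient TLS 1.0.
import socket
import ssl

context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse TLS 1.0/1.1

with socket.create_connection(("example.com", 443)) as sock:
    with context.wrap_socket(sock, server_hostname="example.com") as tls:
        print("Negotiated protocol:", tls.version())  # e.g. 'TLSv1.2'
```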

Other fixes address memory leaks that could be exploited in denial-of-service (DoS) attacks and, conversely, crashes caused by attempts to free the same portion of memory twice, a so-called double-free.

OpenSSL now has two full-time coders as a result of investment by a consortium of Internet industry companies that formed the Core Infrastructure Initiative, a not-for-profit group administered by the Linux Foundation. The Initiative was set up in the wake of Heartbleed, as the industry vowed to ensure that such a large hole would never be left unplugged again.

While OpenSSL is used by a large number of encrypted sites, the project has a number of forks, including LibreSSL and Google's recently launched BoringSSL.

Google recently announced that it would be lowering the page rankings of unencrypted pages in its search results as an added security measure.

Source

Many Websites Still Exposed

May 9, 2014
Filed under Security

The world’s top 1,000 websites have been updated to protect their servers against the “Heartbleed” vulnerability, but up to 2% of the top million remained unprotected as of last week, according to a California security firm.

On Thursday, Menifee, Calif.-based Sucuri Security scanned the top 1 million websites as ranked by Alexa Internet, a subsidiary of Amazon that collects Web traffic data.

Of the top 1,000 Alexa sites, all were either immune or had been patched with the newest OpenSSL libraries, confirmed Daniel Cid, Sucuri’s chief technology officer, in a Sunday email.

Heartbleed, the nickname for the flaw in OpenSSL, an open-source cryptographic library that implements SSL (Secure Sockets Layer) and TLS (Transport Layer Security) encryption, was discovered independently by Neel Mehta, a Google security engineer, and researchers from security firm Codenomicon earlier this month.

The bug had been introduced in OpenSSL in late 2011.

Because of OpenSSL’s widespread use by websites (many relied on it to encrypt traffic between their servers and customers) and the very stealthy nature of its exploit, security experts worried that cyber criminals either had captured, or could capture, usernames, passwords, and even the encryption keys used by site servers.

The OpenSSL project issued a patch for the bug on April 7, setting off a rush to patch the software on servers and in some client operating systems.
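One quick check an administrator could run after that rush is to see which OpenSSL build a runtime is linked against; for example, Python's standard ssl module reports it directly. Heartbleed affected OpenSSL 1.0.1 through 1.0.1f, and 1.0.1g fixed it:

```python
# Print the OpenSSL build this Python runtime is linked against.
# Heartbleed affected OpenSSL 1.0.1 through 1.0.1f; 1.0.1g fixed it.
import ssl

print(ssl.OPENSSL_VERSION)       # e.g. 'OpenSSL 1.0.1g 7 Apr 2014'
print(ssl.OPENSSL_VERSION_INFO)  # numeric version tuple, e.g. (1, 0, 1, 7, 15)
```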

The vast majority of vulnerable servers had been patched as of April 17, Sucuri said in a blog post that day.

While all of the top 1,000 sites ranked by Alexa were immune to the exploit by then, as Sucuri went down the list and scanned smaller sites, it found an increasing number still vulnerable. Of the top 10,000, 0.53% were vulnerable, as were 1.5% of the top 100,000 and 2% of the top 1 million.

Other scans found similar percentages of websites open to attack: On Friday, San Diego-based Websense said about 1.6% of the top 50,000 sites as ranked by Alexa remained vulnerable.

Since it’s conceivable that some sites’ encryption keys have been compromised, security experts urged website owners to obtain new SSL certificates and keys, and advised users to be wary of browsing to sites that had not done so.

Sucuri’s scan did not examine sites to see whether they had reissued their certificates, but Cid said that another swing through the Web, perhaps this week, would. “I bet the results will be much much worse on that one,” Cid said.
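A rough version of that follow-up check can be done with standard library tools: fetch a site's certificate and see whether it was issued after the April 7 patch date. A minimal sketch, assuming the certificate's issuance date is a usable proxy for key reissue:

```python
# Sketch: was this site's certificate issued after the Heartbleed patch
# (April 7, 2014)? A newer notBefore date is a rough signal that the site
# reissued its keys, though not proof that the old key was revoked.
import socket
import ssl

PATCH_DATE = ssl.cert_time_to_seconds("Apr  7 00:00:00 2014 GMT")

def cert_issued_after_patch(host: str, port: int = 443) -> bool:
    context = ssl.create_default_context()
    with socket.create_connection((host, port)) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()  # validated peer certificate
    return ssl.cert_time_to_seconds(cert["notBefore"]) > PATCH_DATE

print(cert_issued_after_patch("example.com"))
```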

Source