Thursday, August 23, 2012

Changing SEO Strategies Post-Google Penguin

The Google Penguin algorithm update is the latest spam-fighting wave to crash against the shore of search, and it has heralded something of a new dawn for SEO, especially in terms of offsite strategy.

In 2011, Google ran an algorithm update known as the Panda update. It may sound cute and cuddly, but the Panda update was the first major algorithm update to focus on the quality of onsite content -- pushing sites with rich content and a great user experience to the top of its search rankings, and relegating low-quality sites to the bottom.

Whilst the Panda algorithm, which is still refreshing 18 months on, is arguably more concerned with onsite factors, its not-too-distant relative, Google Penguin, was designed to focus on the other significant realm of SEO: offsite.

The Google Penguin update has led to the ice cracking under many websites' tried-and-tested offsite strategies, which until now have involved the use of a high quantity of low-quality links.

Google has long advised webmasters to make sure their content adheres to the "Google Webmaster Guidelines," and the work of its much-referenced "Web Spam team" has contributed to the consensus that the days of low-level linkbuilding were numbered. Indeed, with the rise of personalized search, and with social signals seemingly an increasing ranking factor, it could be argued that SEOers who have not started to adapt their strategy could be left out in the cold.

It was around mid-March 2012 that webmasters began to receive messages warning of unnatural links detected pointing to their domains, along with advice on how to spot them:

"...artificial or unnatural links pointing to your site which could be intended to manipulate PageRank. Examples of unnatural linking could include buying links to pass Page Rank or participating in link schemes."

Search personalities such as Google's Matt Cutts and SEOmoz's Rand Fishkin have long spoken against using the black hat techniques of high-volume, low-quality linkbuilding, but this was a first from Google in terms of the sheer volume of sites contacted.

Over 700,000 messages were sent by the end of Q1 2012 -- more than in the whole of 2011. For many SEOers, this represented the beginning of the end for their current offsite strategies. And if it didn't, the subsequent ranking drops and Webmaster Tools messages should have.

"We encourage you to make changes to your site so that it meets our quality guidelines. Once you've made these changes, please submit your site for reconsideration in Google's search results."

Now, Google's initial messages sent in March 2012 were the equivalent of an organic-visibility death sentence, but its hard-line message was blurred when a new wave of messages was sent in July, only this time sites which hadn't been participating in shady link schemes were receiving them too.

How could a site which had never built any links be receiving such messages?

Matt Cutts was quick on the scene and cleared things up via his Google+ account, insisting the latest batch of messages was not meant to induce panic but was sent as a transparency exercise, to give greater clarity about what Google likes and doesn't like. Another round of Webmaster messages followed, insisting Google would only take action against the specific links which contravened its guidelines, not against the domain as a whole (as it had back in March).


Whichever way you look at it, the future is NOT in low-level link building, so start planning and move one step ahead with a new offsite strategy immediately.

Most SEOers reacted by balancing the following two strategies:

1) Identifying links that may contravene the Google Webmaster Guidelines, contacting the linking sites, and requesting that the links be removed. Google has de-indexed a substantial portion of "spammy" sites, so removing the links they point at your site should be a top priority before submitting a reconsideration request.

Since so many historical links pointing to a site had been de-valued or de-indexed, much of its previous link equity would have been removed, weakening the site's authority and trust.

How could a severely weakened offsite profile be strong again?

2) Agreeing and implementing a new content-focused strategy built around engagement with relevant communities.

The ideal way of obtaining natural links is by creating content that is so useful/informative/entertaining that it begs to be shared, retweeted, '+1'd, and embedded on blogs that attract the same demographic of users that visit your site. This might also mean some actual natural traffic from your offsite efforts (something low-level linkbuilding does not provide!).

Creating content to be shared is a longer-term strategy than the quick fix of buying low-level links. But a natural strategy should engage the right audience along the way, moving SEO into the realm of more creative marketing. It also means that the great content which has been produced can take on a life of its own and be shared and linked to long after its inception.

Guest blogging, too, is a natural way of building brand recognition. Partnering with popular blogs in your industry means more people will read your content, and the potential for natural backlinks, and the natural traffic they bring, should not be underestimated.
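The first strategy above, auditing a backlink profile for suspect links, can be sketched as a simple filter. Everything here is invented for illustration: the domain list, the anchor-text heuristic, and the sample data are hypothetical, not a real audit tool or any tool the article describes.

```python
# Minimal sketch of a backlink audit: flag links whose source domain is on a
# known-deindexed list, or whose anchor text looks like exact-match keyword spam.
# All domains, anchors, and sample data below are hypothetical.

from urllib.parse import urlparse

DEINDEXED_DOMAINS = {"cheap-links.example", "articlespinner.example"}
SPAMMY_ANCHORS = {"buy widgets cheap", "best widgets online"}

def flag_suspect_links(backlinks):
    """Return the subset of (source_url, anchor_text) pairs worth reviewing."""
    suspect = []
    for url, anchor in backlinks:
        domain = urlparse(url).netloc
        if domain in DEINDEXED_DOMAINS or anchor.lower() in SPAMMY_ANCHORS:
            suspect.append((url, anchor))
    return suspect

links = [
    ("http://cheap-links.example/page1", "buy widgets cheap"),
    ("http://industry-blog.example/review", "a thoughtful review"),
]
for url, anchor in flag_suspect_links(links):
    print(f"review and request removal: {url} ({anchor!r})")
```

In practice the heavy lifting is manual: exporting links from Webmaster Tools, judging each one, and emailing the linking sites, but a filter like this narrows the pile.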

So, Google Penguin has arrived and ruffled the feathers of SEO offsite strategies. With so much great, unique content being shared, such as infographics, widgets and videos, Google deemed now the perfect time to rid its index of low-quality sites with low-quality content, destabilizing many offsite strategies and penalizing those who implemented them.

For SEO offsite strategy, it means adopting a longer-term approach revolving around unique, high-quality content and utilizing social platforms to drive engagement and exposure. Creativity should be at the heart of offsite strategies moving forward, and if it isn't, you may well be left out in the cold... like a penguin.

Source: Huff Post

Friday, August 3, 2012

Google Cloud vs. Amazon Cloud: How They Stack up

Google's new IaaS cloud boasts strong compute performance but lacks the breadth of features in Amazon Web Services' 4-year-old Elastic Compute Cloud, according to one industry analyst's side-by-side comparison of the services.

Neither company provides details of the silicon chips within its servers, but analyst Chris Gaun from Ideas International (recently acquired by Gartner) has used information in public statements to determine the hardware behind each vendor's cloud. Google has said it uses Intel Sandy Bridge processors and that each unit of its Compute Engine delivers performance matching that of at least a 1.0- to 1.2-GHz 2007 Opteron chip. Other media have reported that Google uses 2.6-GHz processors, which leads Gaun to believe the company has Xeon E5-2670 chips, the only ones on the market at the time of Google's announcement that deliver that level of raw compute power.

Gaun believes Google is running the high-capacity chip across its cloud infrastructure, while Amazon makes it available in certain instance types for Elastic Compute Cloud customers, including in its recently announced high I/O extra large cluster compute offering. "Google seems to be running only the latest and greatest chips on the market, while Amazon has a wide variety of chips for customers to use," Gaun says.

Amazon isn't standing pat either. AWS on Wednesday, for example, announced the ability to set the input/output operations per second (IOPS) in Elastic Block Storage.

There are other differences between Google Compute Engine, which is still in limited preview mode, and Amazon cloud services. AWS has 11 different sizes of compute instances, ranging from small virtual machines with 1.7GB of memory to extra-large compute clusters with 60.5GB of memory, whereas Google has only four. Google also makes the fiber-optic links between its own data centers available to cloud customers. AWS offers a broader variety of supporting features in its cloud, though, such as EBS volumes, relational database services and load balancers.

The two companies are appealing to different customers, Gaun says. While AWS is targeting technology-reliant businesses that are turning to the cloud to host their websites, databases and storage, Google is focused initially on research and development teams that may have a need for high-performance computing to complete a project, for example. The strategy is seen in the pricing models: AWS offers reserved instance pricing discounts, in which customers agree to use a compute instance for months or even years. Google's cloud is priced by smaller time chunks and therefore aimed at shorter-lived projects.
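The pricing trade-off Gaun describes can be illustrated with a rough break-even calculation. All the rates below are hypothetical placeholders, not actual AWS or Google prices from the time:

```python
# Rough break-even sketch: reserved (upfront fee + discounted hourly rate)
# vs. pure on-demand pricing. All prices are hypothetical placeholders.

HOURS_PER_MONTH = 730

def monthly_cost_on_demand(hourly_rate, hours_used):
    """Pay only for the hours actually consumed."""
    return hourly_rate * hours_used

def monthly_cost_reserved(upfront_fee, term_months, discounted_hourly, hours_used):
    """Amortize the upfront fee over the term, then pay a discounted hourly rate."""
    return upfront_fee / term_months + discounted_hourly * hours_used

# Hypothetical rates: $0.10/hr on demand; $200 upfront for a 12-month
# reservation that drops the hourly rate to $0.04.
for utilization in (0.25, 0.50, 1.00):
    hours = HOURS_PER_MONTH * utilization
    od = monthly_cost_on_demand(0.10, hours)
    rs = monthly_cost_reserved(200, 12, 0.04, hours)
    print(f"{utilization:>4.0%} utilization: on-demand ${od:6.2f}, reserved ${rs:6.2f}")
```

At low utilization, paying only for small time chunks wins; at sustained, round-the-clock use, the reservation wins, which mirrors the split between Google's short-lived R&D projects and the long-running websites and databases AWS targets.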

Gaun says if Google wants to compete in a broader market with Amazon, it will likely have to offer a discounted pricing option for long-term use. That may come in time, Gaun predicts, given that the company's cloud computing offering isn't even generally available yet.

Source: PCW

Google debuts super fast broadband service in Kansas City

Google has kept its long-stated promise of super high-speed Internet access by debuting a new fiber-based Internet service in Kansas City with speeds more than 100 times faster than most U.S. Internet systems.

Google Fiber TV service is priced at $120 a month for a package that includes television channels, one-gigabit-per-second Internet speeds and one terabyte of cloud storage. For $70 a month, the service is available without the television channels.
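The "100 times faster" claim is easy to sanity-check with back-of-the-envelope arithmetic. The 10 Mbps baseline below is an assumed typical 2012 U.S. broadband speed for comparison, not a figure from the article:

```python
# Back-of-the-envelope: time to download a file at different line speeds.
# 10 Mbps is an assumed "typical" 2012 U.S. broadband speed, for comparison.

def download_seconds(file_size_gigabytes, speed_megabits_per_sec):
    """Convert GB to megabits (1 GB = 8,000 megabits, decimal units) and divide by speed."""
    return file_size_gigabytes * 8000 / speed_megabits_per_sec

typical = download_seconds(2, 10)     # a 2 GB movie on 10 Mbps broadband
fiber   = download_seconds(2, 1000)   # the same file on 1 Gbps fiber

print(f"10 Mbps: {typical:.0f} s (~{typical/60:.0f} min)")
print(f"1 Gbps:  {fiber:.0f} s")
print(f"Speedup: {typical/fiber:.0f}x")
```

The same file that takes nearly half an hour on the assumed baseline arrives in 16 seconds at a gigabit, exactly the 100x ratio of the two line speeds.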

Advanced-level subscriptions offer the ability to record eight TV shows at once and store up to 500 hours of high-definition programming in the cloud. The service comes with a router and a Nexus 7 tablet that can act as the system's remote control; subscribers can also use their own tablet or smartphone as a voice-activated remote if desired.

The TV service allows subscribers to search live channels, Netflix, YouTube, recorded shows and tens of thousands of hours of on-demand programming.

“The Internet is a huge positive force, and yet we are at a crossroads,” said Patrick Pichette, Google’s chief financial officer. Broadband speeds, he said, have leveled off since around 2000, and Google's new service will be 100 times faster.

“We will make Kansas City a place where bandwidth flows like water,” Milo Medin, vice president of access services at Google, told the Los Angeles Times.

Google invested in building out fiber in Kansas City, Missouri in 2011 after earlier inviting cities to help identify communities that would be interested in taking part in the project. For months, the company has been laying a network of fiber optic cable in the city.

Ron Josey, an analyst at ThinkEquity, told the newspaper that Google has long been frustrated by Internet speeds offered by other providers. A faster online infrastructure, he said, would let Google create more products.

“This is their way of showing, if we offer a better pipeline, look at what we can do on the Web in terms of innovation,” Josey said.

Sameet Sinha, an analyst at B. Riley & Co., said the project hopes to stimulate others to follow Google’s lead. “They want to get the government to notice that higher broadband should be a strategic priority,” he told the newspaper. “Second, it could force cable companies to start offering higher-speed Internet.”

Source: BroadcastEngineering

Google Play Grows Up: New Developer Policies Will Clean Up Google's App Store

Google Play, Google's Android app store, is close to eclipsing Apple's App Store in pure numbers, but there's one area in which it's lacking. The former Android Market has a ton of rogue apps -- including copycat games, spam, and malware.

And now, Google is looking to clean up its act.

The difference between Google's and Apple's app stores is that Apple polices every app submitted to its store, often with a draconian approval process. Google's market, on the other hand, is a free-for-all: anyone can submit apps, and there's virtually no screening process. Google has been keeping an eye on its store and removing known malicious apps, but now we're looking at a potential crackdown.

In a letter to developers Wednesday, Google outlined a new set of rules, which it hopes will eliminate some of the spammy apps in the Google Play store. Developers have 30 days to comply with the new policies, or risk their app being removed from the store.

The new rules are quite comprehensive. All Google Play apps must use Google's own payment system for downloads or in-app purchases (except for physical goods and goods consumed outside the app). To reduce copycat apps, Google says app developers shouldn't "pretend to be someone else" and may not "represent that [their] app is authorized by or produced by another company or organization if that is not the case." Apps also should not have names or icons too similar to apps that ship with Android.
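Google doesn't publish how it measures whether an app name is "too similar" to a built-in one. Purely as an illustration of what such a check might look like, here is a naive string-similarity sketch; the app list and threshold are invented:

```python
# Illustrative only: Google does not disclose how it detects copycat names.
# A naive check using Python's difflib, flagging submitted app names that
# nearly match titles assumed here to ship with Android.

from difflib import SequenceMatcher

BUILT_IN_APPS = ["Gmail", "Google Maps", "Play Store", "Calendar"]

def too_similar(candidate, threshold=0.8):
    """Return the first built-in app name the candidate nearly matches, else None."""
    for name in BUILT_IN_APPS:
        if SequenceMatcher(None, candidate.lower(), name.lower()).ratio() >= threshold:
            return name
    return None

print(too_similar("Gmaii"))        # near-match of "Gmail"
print(too_similar("Weather Pro"))  # no conflict with any built-in name
```

A real policy check would weigh icons, branding and developer identity too; this only shows the kind of fuzzy matching a name rule implies.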

Google also gets specific in the new rules about not transmitting viruses, worms, Trojan horses, and malware, as well as about misleading product descriptions, repetitive content, ratings gaming, and apps that send automated SMS and email messages. There are also new rules regarding suspicious ad practices in apps: developers can no longer make ads look like system notifications or collect personal user data.

Google's new developer policies for the Google Play store show that the company is finally looking to mature its app marketplace. This makes sense, since Google Play is now similar in size to Apple's App Store, with 600,000 apps to Apple's 650,000. Even though studies indicate that a majority of Apple App Store apps are never downloaded, Apple undoubtedly has the upper hand when it comes to app quality. With its latest move, Google could be closing in soon.

Source: PCW
