Sunday, June 30, 2013

Facebook moves to remove ads that display controversial content



Facebook is launching an aggressive strategy to better detect violent, graphic, sexual, and otherwise controversial content across its site and to remove ads that appear alongside that content.

The changes follow recent controversy over hate speech appearing on the site, which Facebook has vowed to combat more effectively, though that pledge has not stopped some marketers from pulling their ads in response.

The new detection-and-removal policy, which was announced Friday in a blog post, is designed to provide Facebook with a better mechanism for removing ads that appear alongside certain types of questionable content on Groups and Pages.

“While we already have rigorous review and removal policies for content against our terms, we recognize we need to do more to prevent situations where ads are displayed alongside controversial Pages and Groups,” the company said.

“So we are taking action,” Facebook added.
Details

The new review process, beginning Monday, “will expand the scope of Pages and Groups that should be ad-restricted,” the company said. Ads from all Pages and Groups that fall into this more comprehensive restricted list will be removed by the end of next week.

Previously, a Page selling adult products was eligible to have ads appear in its right-hand column; going forward, ads will not be displayed next to that type of content, Facebook said. The changes will be applied to Pages and Groups containing violent, graphic, and sexual content that does not otherwise violate the company’s community standards.

How Facebook classifies content as offensive or not is complicated. In terms of graphic content, “we understand that graphic imagery is a regular component of current events, but must balance the needs of a diverse community,” the site says in its community standards.

For instance, “sharing any graphic content for sadistic pleasure is prohibited,” the site says.
Hate speech policies

Facebook introduced new policies to combat hate speech on the site last month, following the campaign of several high-profile women’s groups including Women, Action and the Media, and the Everyday Sexism Project.

Around the same time, some big-name brands like Nissan and Unilever’s Dove brand pulled their ads from the site.

The review process will be carried out manually by humans at first, “but in the coming weeks we will build a more scalable, automated way to prevent and/or remove ads” that appear next to controversial content, Facebook said.

“All this will improve detection of what qualifies as questionable content,” the site said, adding, “we will continue to work aggressively on this issue with advertisers.”

The changes will not impact Facebook’s business, the company said.

Facebook’s revenue is derived almost entirely—84 percent in 2012—from ads.

Other ad-dependent companies are also grappling with how to deal with questionable content. Google has recently made moves to remove adult-themed blogs on its Blogger platform that also have adult advertisements.


Thursday, June 27, 2013

New Google Play Edition devices have a new camera app, drawer, and wallpaper



The new HTC One and Samsung Galaxy S 4 with stock Android have been released, and with their version of Android 4.2.2 come a few unique touches to the UI. The camera app, which was rumored to get an update in 4.3, has a new stacked menu system for easier navigation. Another big change is that the app drawer has been switched from a 4×4 layout to a 4×5 layout to take advantage of the different screen sizes of the One and S4.

Other minor changes are the addition of a new red Phase Beam wallpaper alongside the purple and blue versions, and a new custom boot animation. The lockscreen clock on the S4 is a little different from Nexus devices, probably so that the phone works correctly with Samsung’s S-View flip covers.

Source: Computer World

Sunday, June 23, 2013

Facebook security bug exposes 6 million users' contact info



Facebook accidentally exposed 6 million users’ contact information. Watch out for an email alert from the network to find out if you were affected by an apparent security bug.

The bug allowed the emails and phone numbers of some 6 million users to be accessed by contacts or friends of friends as part of the site’s friend recommendation algorithm, the social network’s security team said Friday.

If you upload your contacts or address book to Facebook in order to find friends, Facebook uses that information to determine if your friends are already on the network or if you should invite them to join. That contact information may have been included in account archive information that users can download. In other words, people who have some connection to you may have been able to view your contact information when they downloaded their archive. Facebook said it disabled the Download Your Information tool, fixed it, and turned it back on within a day.
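Conceptually, the contact-matching step described above amounts to checking each uploaded address-book entry against the set of registered users. The sketch below is purely illustrative (it is not Facebook's actual code, and the addresses are made up):

```python
# Illustrative sketch of contact matching: uploaded address-book
# entries are split into people already on the network and people
# who could be invited to join. All data here is hypothetical.

registered_users = {
    "alice@example.com": "Alice",
    "bob@example.com": "Bob",
}

uploaded_contacts = ["bob@example.com", "carol@example.com"]

# Contacts already on the network can be suggested as friends.
already_on_network = [c for c in uploaded_contacts if c in registered_users]

# Everyone else becomes a candidate for an invitation email.
invite_candidates = [c for c in uploaded_contacts if c not in registered_users]
```

The bug, in these terms, was that matched contact details leaked into other users' downloadable archives rather than staying on the server side of this lookup.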

Facebook’s security team said each affected user’s information was downloaded just once or twice, which is a small consolation. The company also noted that no financial information was included and only Facebook users have access to the download tool (so information was probably not sold to advertisers).

“Although the practical impact of this bug is likely to be minimal since any email address or phone number that was shared was shared with people who already had some of that contact information anyway, or who had some connection to one another, it's still something we're upset and embarrassed by, and we'll work doubly hard to make sure nothing like this happens again,” Facebook said in its Friday announcement.

This isn’t the first time Facebook users’ personal information has been exposed. Facebook in 2011 introduced a White Hat bug bounty program, where security experts can file reports about bugs and collect rewards. This most recent bug was discovered by one such researcher.

Thursday, August 23, 2012

Changing SEO Strategies Post-Google Penguin

The Google Penguin algorithm update is the latest spam-fighting wave to crash against the shore of search and it has heralded something of a new dawn for SEO, especially in terms of offsite strategy.

In 2011, Google ran an algorithm update, known as the Panda update. It may sound cute and cuddly, but the Panda update was the first major algorithm update to focus on the quality of onsite content -- pushing sites with rich content and a great user experience to the top of its search rankings, and relegating low-quality sites to the bottom.

Whilst the Panda algorithm, which is still refreshing 18 months on, is arguably more concerned with onsite factors, its not-too-distant relative, Google Penguin, was designed to focus on the other significant realm of SEO: offsite.

The Google Penguin update has led to the ice cracking under many websites' tried-and-tested offsite strategies, which until now have involved the use of a high quantity of low-quality links.

Google has long advised web users to make sure their content adheres to the "Google Webmaster Guidelines," and the presence of the frequently cited "Web Spam team" has contributed to the common consensus that the days of low-level linkbuilding were numbered. Indeed, with the rise of personalized search and social signals seeming to be an increasing factor in ranking, it could be argued that those SEO-ers who have not started to adapt their strategy could be left out in the cold.

It was around mid-March 2012 that webmasters began to receive messages warning of unnatural links detected pointing to their domains, with advice on how to spot them:

"...artificial or unnatural links pointing to your site which could be intended to manipulate PageRank. Examples of unnatural linking could include buying links to pass Page Rank or participating in link schemes."

Search personalities such as Google's Matt Cutts and SEOmoz's Rand Fishkin have long spoken against using the black hat techniques of high volume, low-quality linkbuilding, but this was a first from Google, in the sense of the volume of sites contacted.

Over 700,000 messages were sent by the end of Q1 -- more than in the whole of 2011 -- and for many SEOers this represented the beginning of the end of their current offsite strategies. And if it didn't, the ranking drops and the subsequent Webmaster Tools messages should have.

"We encourage you to make changes to your site so that it meets our quality guidelines. Once you've made these changes, please submit your site for reconsideration in Google's search results."

Now, Google's initial messages sent in March 2012 were the equivalent of an organic visibility death sentence, but that hard-line message was blurred when a new wave of messages was sent in July, only this time sites which hadn't been participating in shady link schemes were receiving them too.

How could a site which had never built any links be receiving such messages?

Matt Cutts was quick on the scene and cleared things up via his Google+ account, insisting the latest batch of messages was not intended to induce panic, but was sent as a transparency exercise to give greater clarity as to what Google likes and doesn't like. Another round of Webmaster messages was sent insisting Google would only take action against specific links which contravened its guidelines, and not against the domain as a whole (as it had done back in March).

Confused?

Whichever way you look at it, the future is NOT in low-level link building, so start planning and move one step ahead with a new offsite strategy immediately.

Most SEOers reacted by balancing the following two strategies:

1) Identifying links that may contravene the Google Webmaster Guidelines, contacting the linking sites, and requesting that those links be removed. Google has de-indexed a substantial portion of "spammy" sites, so getting these links removed should become a top priority before a site is submitted for a reconsideration request.

Since so many historical links pointing to a site had been de-valued or de-indexed, all previous link equity would have been removed, thus weakening the authority and trust of your site.

How could a severely weakened offsite profile be strong again?

2) Agreeing and implementing a new content-focused strategy built around engagement with relevant communities.

The ideal way of obtaining natural links is by creating content that is so useful/informative/entertaining that it begs to be shared, retweeted, '+1'd, and embedded on blogs that attract the same demographic of users that visit your site. This might also mean some actual natural traffic from your offsite efforts (something low-level linkbuilding does not provide!).

Creating content to be shared is a longer-term strategy than the quick fix of buying low-level links. But a natural strategy should engage the right audience along the way, moving SEO into the realm of more creative marketing. It also means that the great content which has been produced can take on a life of its own and be shared and linked to long after its inception.

Guest blogging, too, is a natural way of building brand recognition. Partnership with popular blogs in your industry means that more people will read your content, and the potential for natural back links, and natural traffic as a result of this, should not be underestimated.
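The audit step in strategy 1 above can be sketched in a few lines. This is a minimal illustration only: the domain names and the de-indexed list are hypothetical, and a real audit would work from exported backlink data from a tool like Webmaster Tools:

```python
# Minimal sketch of a backlink audit: split known backlinks into
# links to keep and links to request removal for, based on a list
# of de-indexed or guideline-violating domains. All data is made up.

def audit_backlinks(backlinks, deindexed_domains):
    """Partition (source_domain, target_url) backlink pairs into
    links to keep and links whose removal should be requested."""
    keep, request_removal = [], []
    for source, target in backlinks:
        if source in deindexed_domains:
            request_removal.append((source, target))
        else:
            keep.append((source, target))
    return keep, request_removal

backlinks = [
    ("example-blog.com", "http://mysite.com/page"),
    ("spammy-directory.net", "http://mysite.com/"),
]
deindexed = {"spammy-directory.net"}

keep, removal = audit_backlinks(backlinks, deindexed)
```

In practice the flagged list feeds the outreach step: each linking site is contacted with a removal request before the reconsideration request goes in.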

So, Google Penguin has arrived and ruffled the feathers of SEO and offsite strategies. With so much great, unique content being shared such as infographics, widgets and videos, Google deemed now the perfect time to rid its index of low quality sites with low quality content, destabilising many offsite strategies, and penalizing those who implemented them.

For SEO offsite strategy, it means adopting a longer term strategy revolving around unique, high-quality content and utilizing social platforms to drive engagement and exposure. Creativity should be at the heart of offsite strategies moving forward, and if it isn't you may well be left out in the cold... like a Penguin.

Source: Huff Post

Friday, August 3, 2012

Google Cloud vs. Amazon Cloud: How They Stack up

Google's new IaaS cloud boasts strong compute performance but lacks the breadth of features in Amazon Web Services' 4-year-old Elastic Compute Cloud, according to one industry analyst's side-by-side comparison of the services.

Neither company provides details of the silicon chips within its servers, but analyst Chris Gaun from Ideas International (recently acquired by Gartner) has used information in public statements to determine the hardware behind each vendor's cloud. Google has said it uses Intel Sandy Bridge processors and that each unit of its Compute Engine delivers performance matching that of at least a 1.0- to 1.2-GHz 2007 Opteron chip. Other media have reported that Google uses 2.6-GHz processors, which leads Gaun to believe the company has Xeon E5-2670 chips, the only ones on the market at the time of Google's announcement that deliver that level of raw compute power.

Gaun believes Google is running the high-capacity chip across its cloud infrastructure, while Amazon makes it available in certain instance types for Elastic Compute Cloud customers, including in its recently announced high I/O extra large cluster compute offering. "Google seems to be running only the latest and greatest chips on the market, while Amazon has a wide variety of chips for customers to use," Gaun says.

Amazon isn't standing pat either. AWS on Wednesday, for example, announced the ability to set the input/output operations per second (IOPS) in Elastic Block Storage.

There are other differences between Google Compute Engine, which is still in limited preview mode, and Amazon cloud services. AWS has 11 different sizes of compute instances, ranging from small virtual machines with 1.7GB of memory, to extra-large compute clusters with 60.5GB of memory, whereas Google has only four. Google also makes the fiber-optic links between its own data centers available to cloud customers. AWS has a variety of accommodating features in its cloud though, such as the EBS volumes, relational database services, load balancers and others.

The two companies are appealing to different customers, Gaun says. While AWS is targeting technology-reliant businesses that are turning to the cloud to host their websites, databases and storage, Google is focused initially on research and development teams that may have a need for high-performance computing to complete a project, for example. The strategy is seen in the pricing models: AWS offers reserved instance pricing discounts, in which customers agree to use a compute instance for months or even years. Google's cloud is priced by smaller time chunks and therefore aimed at shorter-lived projects.

Gaun says if Google wants to compete in a broader market with Amazon, it will likely have to offer a discounted pricing option for long-term use. That may come in time, Gaun predicts, given that the company's cloud computing offering isn't even generally available yet.

Source: PCW

Google debuts super fast broadband service in Kansas City

Google has kept its long-stated promise of super high-speed Internet access by debuting a new fiber-based Internet service in Kansas City with speeds more than 100 times faster than most U.S. Internet systems.

Google Fiber TV service is priced at $120 a month for a package that includes television channels, one-gigabit-per-second Internet speeds and one terabyte of cloud storage. For $70 a month, the service is available without the television channels.

Advanced-level subscriptions offer the ability to record eight TV shows at one time and store up to 500 hours of high-definition programming in the cloud. Subscribers can use a tablet or smartphone as a voice-activated remote control if desired. The service comes with a router and a Nexus 7 tablet that can act as the system’s remote control.

The TV service allows subscribers to search live channels, Netflix, YouTube, recorded shows and tens of thousands of hours of on-demand programming.

“The Internet is a huge positive force, and yet we are at a crossroad,” said Patrick Pichette, Google’s chief financial officer. Internet speeds, he said, have leveled out for broadband since around 2000 and Google will be making it 100 times faster with the new service.

“We will make Kansas City a place where bandwidth flows like water,” Milo Medin, vice president of access services at Google, told the Los Angeles Times.

Google invested in building out fiber in Kansas City, Missouri, in 2011 after earlier inviting cities to help identify communities that would be interested in taking part in the project. For months, the company has been laying a network of fiber optic cable in the city.

Ron Josey, an analyst at ThinkEquity, told the newspaper that Google has long been frustrated by Internet speeds offered by other providers. A faster online infrastructure, he said, would let Google create more products.

“This is their way of showing, if we offer a better pipeline, look at what we can do on the Web in terms of innovation,” Josey said.

Sameet Sinha, an analyst at B. Riley & Co., said the project hopes to stimulate others to follow Google’s lead. “They want to get the government to notice that higher broadband should be a strategic priority,” he told the newspaper. “Second, it could force cable companies to start offering higher-speed Internet.”

Source: BroadcastEngineering

Google Play Grows Up: New Developer Policies Will Clean Up Google's App Store

Google Play, Google's Android app store, is close to eclipsing Apple's App Store in pure numbers, but there's one area in which it's lacking. The former Android Market has a ton of rogue apps -- including copycat games, spam, and malware.

And now, Google is looking to clean up its act.

The difference between Google's and Apple's app stores is that Apple polices every app submitted to the store, often with a draconian approval process. Google's market, on the other hand, is a free-for-all: anyone can submit apps, and there's virtually no screening process. Google has been keeping an eye on its store, and removing known malicious apps, but now we're looking at a potential crackdown.

In a letter to developers Wednesday, Google outlined a new set of rules, which it hopes will eliminate some of the spammy apps in the Google Play store. Developers have 30 days to comply with the new policies, or risk their app being removed from the store.

The new rules are quite comprehensive. All Google Play apps must use Google's own payment system for downloads or in-app purchases (except for physical goods and goods consumed outside the app). To reduce copycat apps, Google says app developers shouldn't "pretend to be someone else" and may not "represent that [their] app is authorized by or produced by another company or organization if that is not the case." Apps also should not have names or icons too similar to apps that ship with Android.

Google also gets specific in the new rules about not transmitting viruses, worms, Trojan horses, and malware, as well as about misleading product descriptions, repetitive content, ratings gaming, and apps that send automated SMS and email messages. There are also new rules regarding suspicious ad practices in apps: developers can no longer make ads look like system notifications or collect personal user data.

Google's new developer policies for the Google Play store show that the company is finally looking to mature its app marketplace. This makes sense, since Google Play is now similar in size to Apple's App Store, with 600,000 apps to Apple's 650,000. Even though studies indicate that a majority of Apple App Store apps are never downloaded, Apple undoubtedly has the upper hand when it comes to app quality. With its latest move, Google could be closing in soon.

Source: PCW

Flash News