
WhatsApp Bought by Facebook for $19 Billion



The surprising tech news of the week is Facebook's $19 billion acquisition of WhatsApp, a fast-growing mobile-messaging startup. WhatsApp allows smartphone users to evade phone company limits and send unlimited text messages. The service is free for the first year, and a buck for each following year. Five years after its creation by Jan Koum, a Ukrainian immigrant who dropped out of college, and Brian Acton, a Stanford alum, the app has 450 million users—most outside the United States—and a million new users signing up each day. The eye-popping price tag—about one-tenth the entire value of Facebook—is the shocker that's drawn much media notice. But there's another element to the story that is astounding: Koum and Acton have published a manifesto that radically critiques the foundation of modern capitalism—advertising—and denounces materialism. Facebook's business model, of course, depends upon both.

Keep visiting us for more of the latest tech news.
Please comment if you have questions or suggestions.




Facebook Now Allows Friends To Promote Posts, But Privacy Concerns Arise

On average, only 16 percent of your friends see your Facebook posts. However, with promoted posts, one of Facebook’s newest features, more of your friends will likely see your posts than ever before, because it allows users to pay to promote their friends’ posts.

Facebook hopes that, as a result of this new feature, quality content supported by you and your friends will be featured at the top of your news feed. Some users already have the feature, and the gradual rollout will continue among Facebook users with fewer than 5,000 total friends and subscribers.


Facebook’s new service could be described as a paid version of Reddit, a free website on which users vote other users’ submissions “up” or “down” based on whether they like them. The challenge for Facebook is that with free -- and immensely popular -- alternatives like Reddit, many users may not want to pay to promote their friends’ posts.

Facebook began testing the concept of users promoting their own posts last May, and it officially rolled out the feature to the public in October. Promoted posts usually cost about $7, though the price varies based on location and how many people the post can potentially reach. The new option lets users promote others' posts too, which will help them reach a wider audience than they would without the service -- and potentially help Facebook’s bottom line as well.

However, the new promoted posts service could increase the potential for cyberbullying.

This potential problem could occur when someone posts, say, a photo on Facebook, and then that user’s friend decides to promote that photo. The problem is that the friend doesn't need the original user’s explicit permission to promote the photo.

This could be a problem, for example, if one of your friends from college decides to promote an old embarrassing photo of you -- you won’t be able to prevent the picture from getting to the top of the News Feed for a large percentage of your friends.

This could lead to public shaming, which is considered a form of cyberbullying.

It’s a problem that Facebook has addressed in the past. The site has been proactively teaching users about cyberbullying, and it partnered with several anti-bullying organizations after a user named Amanda Todd committed suicide after being bullied on Facebook, which reignited the charge that Facebook facilitates cyberbullying more than other social networks do.

However, promoted posts might offer another way for cyberbullies to pick on their victims: by letting them pay a small fee to push embarrassing or personal content to the top of the News Feed, Facebook could in effect encourage users to embarrass one another.

In addition, there’s no way to determine who promoted a post, so, as TechCrunch notes, a friend promoting an article written by you could be perceived as a shameless plug by you to get more of your friends to check out your article -- even if you had nothing to do with the promoted post.

However, Facebook is quick to point out that the service has a lot of benefits. A Facebook spokesperson released a statement to AllFacebook on Friday, describing the feature as follows:

“If your friend is running a marathon for charity and has posted that information publicly, you can help that friend by promoting their post to all of your friends,” the statement said. “Or if your friend is renting their apartment out and she tells her friends on Facebook, you can share the post with the people you and your friend have in common so that it shows up higher in news feed and more people notice it.”

Facebook will likely monitor this new feature to see how it’s used. It has the potential to make Facebook a more Reddit-like site, where popular posts dominate the front page on merit -- but it could also be used maliciously to embarrass or play pranks on friends. Only time will tell.

Facebook is planning to emphasize its mobile platform in 2013, and it will be interesting to see if the Promoted Posts feature will be included in its mobile plans. In the company’s most recent earnings report, Facebook said mobile ad revenue accounts for 23 percent of its total ad revenue.

International Business Times contacted Facebook to ask if any safeguards will be put in place to prevent cyberbullying through the new service -- and how the site plans to address potential complaints about it -- but it didn’t get back to us by the time this article was published.

Facebook lets users promote friends' posts

The move expands upon the personal promoted posts feature launched last October.

Facebook has expanded its promoted posts program, allowing users to pay to highlight posts made by friends. The move, if successful, could help boost fees Facebook collects from the recently introduced service.

The new feature lets users pay to push posts made by friends to the top of the news feed of people with whom the post was originally shared. Previously, users could only promote their own posts, typically for $7 a pop in the U.S.

Under the changes, people can now promote any friend's posts, such as status updates, photos, or videos. Facebook says the new function respects privacy because the post is only promoted to the same users who would originally have been able to see it.

"It simply helps to ensure that important posts are noticed by more people," said Gwendolyn Belomy, a spokeswoman for Facebook. It began gradually rolling out globally starting Thursday to people with fewer than 5,000 friends or subscribers.

The social network launched the promoted posts program in October 2012. Fees the social network has collected from the tool thus far "have not been material," Facebook said in its annual report filed Feb. 1 with the U.S. Securities and Exchange Commission.

Charity donations, fundraising and publicizing events are just some uses of the new functionality that Facebook is highlighting.

"If your friend is running a marathon for charity and has posted that information publicly, you can help that friend by promoting their post to all of your friends," the company said.

The tool can also make the apartment hunting process easier, the social network said. If a friend is renting out her apartment, for instance, and tells her friends on Facebook, someone else can share that post with both users' mutual friends so that it shows up higher in the news feed.

Other uses may include promoting a friend's announcement about a move or a new job, the company said.

The service costs the same as promoting your own posts; the price varies depending on the user's geographic location and the size of the person's network.

New iOS Facebook App Gives You Free Calling

You may have used voice calling over Google Talk; now it is also available in your favorite Facebook app for iOS (compatible with iPhone, iPod touch, and iPad; requires iOS 4.3 or later) in the latest version, 2.1.1.

The new function, which is rolling out to the Facebook Messenger iOS app, requires Messenger users to open a conversation with another iPhone owner, tap the "i" button in the top right corner, and press "Free Call."

The recipient will then receive a pop-up notification that says, for example, "Iliyas Mansuree is calling."

What's New in Version 2.1.1
- Send a quick voice message when you have more to say
- Call friends for free right from Messenger*

*Free calling uses your existing data plan, and will be rolling out over the next few weeks.


Like and share the post on Facebook, Twitter, and Google+ with your friends and followers.

FACEBOOK NEW CHAT EMOTICONS



Hi friends, today I am going to share some new Facebook emoticons (smileys) which you might never have used.

All you need to do is type the code for the respective smiley in the chat window, and there you go!




Type these codes in the Facebook chat window to send the new chat emoticons (a worked example follows the lists below):


***MISCELLANEOUS***



[[165367383561291]]
[[127208147390558]]
[[127283884049297]]
[[226129040788867]]
[[292334984130530]]
[[198624010220248]]
[[123024957810430]]
[[281444468564228]]
[[242808105783873]]
[[193466394072772]]
[[171108522930776]]
[[164413893600463]]
[[218595638164996]]
[[189637151067601]]
[[129627277060203]]
[[227644903931785]]
[[100002752520227]]
[[105387672833401]]
[[100002727365206]]
[[224812970902314]]

***INTERNET RELATED***



[[136446926442912]] = friendster
[[googlechrome]] = google chrome
[[2231777543]] = twitter
[[87741124305]] = old youtube logo
[[2513891999]] = new youtube logo

 ***OTHERS***



[[297354436976262]] = Santa Claus
[[TheMagicOfSantaClaus]] = Santa Claus 2
[[111751548879681]] = Adidas
[[DJPAULYD]] = DJ
[[9gags]] = 9gag logo
[[FapFapFapMeme]] = Fap Fap Fap
[[106043532814443]] = Y U NO Guy
[[211782832186415]] = Me Gusta
[[142670085793927]] = Mother of God
[[170815706323196]] = Cereal Guy
[[168456309878025]] = LOL Face
[[167359756658519]] = NO Guy
[[218595638164996]] = Yao Ming
[[224812970902314]] = Derp
[[192644604154319]] = Derpina
[[177903015598419]] = Forever Alone
[[105387672833401]] = Fuck yeah
[[100002727365206]] = Challenge accepted
[[100002752520227]] = Okay face
[[129627277060203]] = Poker face
[[224812970902314]] = Okay face
[[98438140742]] = Socially awkward penguin
[[FUUUOFFICIAL]] = Rage face
[[168040846586189]] = Feel like a sir
[[125038607580286]] = Forever alone christmas

***CELEBRITIES*** 



[[NotBaad]] = Obama
[[barackobama]] = Obama 2
[[254055957957388]] = Jackie Chan
[[218595638164996]] = Yao Ming
[[246631252031491]] = Ryan Gosling
[[VinDiesel]] = Vin Diesel
[[CharlieSheen]] = Charlie Sheen
[[theuncrunched]] = Loudmouth
[[DonaldTrump]] = Greed
[[123670240998921]] = David Hasselhoff
[[Zuck]] = Brilliance
[[simoncowell]] = Condescension, judgment
[[46637413257]] = Badass
[[WilliamShakespeare1]] = Shakespeare
[[CaptainJackSparrow]] = Pirate

 ***CARTOONS***



 [[334954663181745]] = Spongebob
[[100001755689032]] = Squirtle
[[148935948523684]] = Pedobear
[[120219704713360]] = Sonic
[[123363421035031]] = Pooh
[[132045620187428]] = Piglet
[[147290738648754]] = Ultraman
[[40134995667]] = Ichigo
[[100001076048283]] = shin-chan
[[196431117116365]] = Shin chan 2
[[250128751720149]] = domo
[[326134990738733]] = Pikachu
[[155393057897143]] = Doraemon
[[224502284290679]] = Nobita
[[144685078974802]] = Mojacko
[[236147243124900]] = Pokeball
[[269153023141273]] = Poring
[[332936966718584]] = Hello Kitty
[[281009981935146]] = Angry Birds
[[252497564817075]] = Kerokeroppi
[[249199828481201]] = Konata Izumi
[[223328504409723]] = Gintoki Sakata
[[278104690058]] = Disapproval


***MUSIC BANDS***



[[32933472565]] = AC/DC
[[JustinBieber]] = Justin Bieber
[[Nickelback]] = Nickelback
[[131572223581891]] = Led Zeppelin
[[17337462361]] = Queen
[[70307378227]] = The Beatles
[[10225487330]] = The Who
[[10212595263]] = Metallica
[[10901008068]] = Guns n' Roses
[[13141232713]] = Pink Floyd
[[PearlJam]] = Pearl Jam!
[[133580073375696]] = Avenged Sevenfold
[[123241807694952]] = Killswitch Engage
[[49269768301]] = Rolling Stones!
[[32043114034]] = Nirvana
[[115875091774774]] = Megadeth
[[15292660169]] = Overkill
[[BobMarley]] = Bob Marley

***ALPHABET***



[[196920740401785]] - A
[[113544575430999]] - B
[[294715893904555]] - C
[[294660140569858]] - D
[[328415510520892]] - E
[[270221906368791]] - F
[[212614922155016]] - G
[[205633882856736]] - H
[[256255337773105]] - I
[[288138264570038]] - J
[[296999947008863]] - K
[[216672855078917]] - L
[[278786215503631]] - M
[[241341589270741]] - N
[[312524205448755]] - O
[[200138403410055]] - P
[[165410113558613]] - Q
[[203403609746433]] - R
[[334427926570136]] - S
[[250632158335643]] - T
[[285985351447161]] - U
[[343627398996642]] - V
[[315740851791114]] - W
[[136342506479536]] - X
[[224173507657194]] - Y
[[317710424919150]] - Z
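
As a quick worked example using the alphabet codes above (nothing new here, just the H and I entries from the list): to spell "HI" in a chat message, type the two codes back to back and hit Enter:

[[205633882856736]] [[256255337773105]]

Each [[...]] code should be replaced by its emoticon image when the message is sent, so your friend sees the letters rather than the raw codes.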



Happy chatting, and Merry Christmas! :)
Comment and like if you enjoyed this article. :)

How to get hidden email address from Facebook friends

If you want to find out the email addresses of your Facebook friends, this trick will be useful for you. We will use uk.yahoo.com for it.

Steps: 

Step 1. Go to uk.yahoo.com and click Sign Up to create a new account.

(Screenshot: uk.yahoo.com)

Step 2. Once you have filled in the form and your account has been created, you are ready for Step 3. Remember not to use the Yahoo India version. Fill in the form and continue as shown in the sign-up screenshot below.

(Screenshot: uk.yahoo.com sign-up form)

Step 3. Sign in to your yahoo.co.uk account, go to Contacts, and click Import Contacts. Then select the option to import from Facebook.
(Screenshot: Import Contacts)
Step 4. Click the Facebook logo to import contacts from Facebook.

(Screenshot: Import Contacts from Facebook)

Step 5. Click OK on the next screen, where it asks you to share with Yahoo.

(Screenshot: Share with Yahoo)
That's all! You have successfully imported all your contacts from Facebook. Now go to Contacts and search by name; you will find that person's email address there.
Enjoy! Hit the like button if you enjoyed reading this tutorial.

Can Science Make Facebook More Compassionate?

Facebook is confronting cyberbullying and online conflict. Can a team of researchers help boost kindness among the site's 900 million users?
Eighteen months ago, Arturo Bejar and some colleagues at Facebook were reviewing photos on the site that users had flagged as inappropriate. They were surprised by the offending content—because it seemed so benign.
“People hugging each other, smiling for the camera, people making goofy faces—I mean, you could look at the photographs and you couldn’t tell at all that there was something to make somebody upset,” says Bejar, a director of engineering at the social networking site.

Then, while studying a photo, one of his colleagues realized something: The person who reported the photo was actually in the photo, and the person who posted the photo was their friend.
As the team scrolled through the images, they noticed that was true in the vast majority of cases: Most of the issues involved people who knew each other but apparently didn’t know how to resolve a problem between them.
Someone would be bothered by a photo of an ex-boyfriend or ex-girlfriend, for instance, or would be upset because they were excluded from a photo that showed a friend’s “besties.” Often people didn’t like that their kids were in a photo a relative had uploaded. And sometimes they just didn’t like the way they looked.
Facebook didn’t have ways to identify or analyze these problems, let alone resolve them. And that made Bejar and his colleagues feel like they weren’t adequately serving the Facebook community—a concern amplified by the site’s exponential growth and worries about cyberbullying among its youngest users.
“When you want to support a community of a billion people,” says Bejar, “you want to make sure that those connections over time are good and positive and real.”
A daunting mission, but it’s one that Bejar has been leading at Facebook, in collaboration with a team of researchers from Yale University and UC Berkeley, including scientists from the Greater Good Science Center. Together, they’re drawing on insights from neuroscience and psychology to try to make Facebook feel like a safer, more benevolent place for adults and kids alike—and even help users resolve conflicts online that they haven’t been able to tackle offline.
“Essentially, the problem is that Facebook, just like any other social context in everyday life, is a place where people can have conflict,” says Paul Piff, a postdoctoral psychology researcher at UC Berkeley who is working on the project, “and we want to build tools to enable people who use Facebook to interact with each other in a kinder, more compassionate way.”
Facebook as relationship counselor
For users troubled by a photo, Facebook provides the option to click a Report link, which takes them through a sequence of screens where they can elaborate on the problem, called the “reporting flow.”
Up until a few months ago, the flow presented all “reporters” with the same options for resolving the problem, regardless of what that problem actually was; those resolutions included unfriending the user or blocking him or her from ever making contact again on Facebook.
“One thing that we learned is that if you give someone the tool to block, that’s actually not in many cases the right solution because that ends the conversation and doesn’t necessarily resolve anything—you just sort of turn a blind eye to it,” says Jacob Brill, a product manager on Facebook’s Site Integrity and Support Engineering team, which tries to fix problems users are experiencing on the site, from account fraud to offensive content.

Instead, Brill’s team concluded that a better option would be to facilitate conversations between a person reporting content and the user who uploaded the content, a system that they call “social reporting.”
“I really think that was key—that the best way to resolve conflict on Facebook is not to have Facebook step in, but to give people tools to actually problem-solve themselves,” says Piff. “It’s like going to a relationship counselor to resolve relationship conflict: Relationship counselors are there to give couples tools to resolve conflict with each other.”
To help Facebook develop those tools, Bejar turned to Piff and two of his UC Berkeley colleagues, social psychologist Dacher Keltner and neuroscientist Emiliana Simon-Thomas—the GGSC’s faculty director and science director, respectively—all of whom are experts in the psychology of emotion.
“It felt like we could sharpen their communication,” says Keltner, “just to make it smarter emotionally, giving kids and adults sharper language to report on the complexities of what they were feeling.”
The old reporting flow wasn’t very emotionally intelligent. When first identifying the problem to Facebook, users had some basic options: They could select “I don’t like this photo of me,” claim that the photo was harassing them or a friend, or say that it violated one of the site’s Community Standards—for hate speech or drug use or violence or some other offense. Then they could unfriend or block the other user, or send that user a message.
Initially, users had to craft that message themselves, and only 20 percent of them actually sent a message. To boost that rate, Facebook provided some generic default text—“Hey I don’t like this photo. Please remove it.”—which raised the send rate to 51 percent. But often users would send one of these messages and never hear back, and the photo wouldn’t get deleted.
Bejar, Brill, and others at Facebook thought they could do better. The Berkeley research team believed this flow was missing an important step: the opportunity for users to identify and convey their emotions. That would guard against the fact that it’s easier for people online to be insensitive or even oblivious to how their actions affect others.
“If you get someone to express more productively how they’re feeling, that’s going to allow someone else to better understand those feelings, and try to address their needs,” says Piff. “There are some very simple things we can do to give rise to more productive interpersonal interactions.”

(Screenshot: the “I don’t like this photo because:” reporting options)
Instead of simply having users click “I don’t like this photo,” for instance, they decided to prompt users with the sentence “I don’t like this photo because:”, which they could complete with emotion-laden phrases, such as “It’s embarrassing” or “It makes me sad” (see screenshot at left). People reporting photos selected one of these options 78 percent of the time, suggesting that the list of phrases effectively captured what they were feeling.
People were then taken to a screen telling them that the best way to remove the photo was to ask the other user to take it down—blocking or unfriending were no longer presented as options—and they were given more emotionally intelligent text for a message they could send through Facebook, tailored to the particular situation.

(Screenshot: the suggested message text)
That text included the other person’s name, asked him or her more politely to remove the content (“would you please take it down?” vs. the old “please remove it”), and specified why the user didn’t like the photo, emphasizing their emotional reaction and point of view—but still keeping a light touch. For example, photos that made someone embarrassed are described as “a little embarrassing to me.” (See the screenshot at left for an example.)
It worked. Roughly 75 to 80 percent of people in the new, emotionally intelligent flow sent these default messages without revising or editing the text, a 50 percent increase from the number who sent the old, impersonal message.
When Keltner and his team presented these findings at Facebook’s second Compassion Research Day, a public event held on Facebook’s campus earlier this month, he emphasized that what mattered wasn’t just that more users were sending messages but that they were enjoying a more positive overall experience.
“There are a lot of data that show when I feel stressed out, mortified, or embarrassed by something happening on Facebook, that activates old parts of the brain, like the amygdala,” Keltner told the crowd. “And the minute I put that into words, in precise terms, the prefrontal cortex takes over and quiets the stress-related physiology.”
Preliminary data seem to back this up. Among the users who sent a message through this new flow, roughly half said they felt positively about the other person (called the “content creator”) after they sent him or her the message; less than 20 percent said they felt negatively. (The team is still collecting and analyzing data on how users feel before they send the messages, and on how positively they feel after sending a message through the old flow.)
In this new social reporting system, half of content creators deleted the offending photo after they received the request to remove it, whereas only a third deleted the photo under the old system. Perhaps more importantly, roughly 75 percent of the content creators replied to the messages they received, using new default text that the researchers crafted for them. That’s a nearly 50 percent increase from the number who replied to the old kinds of messages.
“The right resolution isn’t necessarily for the photo to be taken down if in fact it’s really important to the person who uploaded it,” says Brill. “What’s really important is that you guys are talking about that, and that there is a dialogue going back and forth.”
This post is a problem
That’s all well and good for Facebook’s adult users, but kids on Facebook often need more. For them, Facebook’s hazards include cyberbullying from peers and manipulation by adult predators. Rough estimates indicate that more than half of kids have had someone say mean or hurtful things to them online.
Previously, if kids felt hurt or threatened by someone on Facebook, they could click the same Report link adults saw, which took them through a similar flow, asking if they or friends were being “harassed.” From there, Facebook gave them the option to block or unfriend that person and send him or her a message, while also suggesting that they contact an adult who could help.

But after hearing Yale developmental psychologist Marc Brackett speak at the first Compassion Research Day in December of 2011, Bejar and his colleagues realized that the old flows failed to acknowledge the particular emotions that these kids were experiencing. That oversight might have made the kids less likely to engage in the reporting process and contact a supportive adult for guidance.
“The way you really address this,” Bejar said at the second Compassion Research Day, “is not by taking a piece of content away and slapping somebody’s hand, but by creating an environment in which children feel supported.”
To do that, he enlisted Brackett and two of his colleagues, Robin Stern and Andres Richner. The research team organized focus groups with 13-to-14-year-old kids, the youngest age officially allowed on Facebook, and interviewed kids who’d experienced cyberbullying. The team wanted to create tools that were developmentally appropriate to different age ranges, and they decided to target this youngest group first, then work their way up.
From talking with these adolescents, they pinpointed some of the problems with the language Facebook was using. For instance, says Brackett, some kids thought that clicking “Report” meant that the police would be called, and many didn’t feel that “harassed” accurately described what they had been experiencing.
Instead, Brackett and his team replaced “Report” with language that felt more informal: “This post is a problem.”
They tried to apply similar changes across the board, refining language to make it more age-appropriate. Instead of simply asking kids whether they felt harassed, they enabled kids to choose among far more nuanced reasons for reporting content, including that someone “said mean things to me or about me” or “threatened to hurt me” or “posted something that I just don’t like.” They also asked kids to identify how the content made them feel, selecting from a list of options.
Depending on the problem they identified, the new flows gave kids more customized options for the action they could take in response. That included messages they could send to the other person, or to a trusted adult, that featured more emotionally rich and specific language, tailored to the type of situation they were reporting.

“We wanted to make sure that they didn’t feel isolated and alone—that they would receive support in a way that would help them reach out to adults who could provide them with the help that they needed,” Brackett said when presenting his team’s work at the second Compassion Research Day.
After testing these new flows over two months, the team made some noteworthy discoveries. One surprise was that, when kids reported problems that they were experiencing themselves, 53 percent of those problems concerned posts that they “didn’t like,” whereas only three percent of the posts were seen as threatening.
“The big takeaway here is that … a lot of the cases are interpersonal conflicts that are really best resolved either between people or with a trusted adult just giving you a couple of pointers,” Jacob Brill said at the recent Compassion Research Day. “So we’re giving [kids] the language and the resources to help with a situation.”
And those resources do seem to be working: Forty-three percent of kids who used these new flows reached out to a trusted adult when reporting a problem, whereas only 19 percent did so with the old flows.
“The new experience that we’re providing is empowering kids to reach out to someone they trust to get the help that they need,” says Brackett. “There’s nothing more gratifying than being able to help the most amount of kids in the quickest way possible.”
Social reporting 2.0
Everyone involved in the project stresses that it’s still in its very early stages. So far, it has only targeted English-language Facebook users in the United States. Brackett’s team’s work has only focused on 13 to 14 year olds, and the new flows developed by the Berkeley team were only piloted on 50 percent of Facebook users, randomly selected.

Can they build a more emotionally intelligent form of social reporting that works for different cultures and age groups?
“Our mission at Facebook is to do just that,” says Brill. “We will continue to figure out how to make this work for anyone who has experiences on Facebook.”
The teams are already working to improve upon the results they presented at the second Compassion Research Day. Brackett says he believes they can encourage even more kids on Facebook to reach out to trusted adults, and he’s eager to start helping older age groups. And he’s excited by the potential impact of this work.
“When we do our work in schools, it’s one district, one school, one classroom at a time,” he says. “Here, we have the opportunity to reach tens of thousands of kids.”
And that reach carries exciting scientific implications for the researchers.
“We’re going to be the ones who get to go in and have 500,000 data points,” says Simon-Thomas. “It’s beyond imagination for a research lab to get that kind of data, and it really taps into the questions we’re interested in: How does conveying your emotion influence social dynamics in rich and interesting ways? Does it facilitate cooperation and understanding?”
And what’s in it for Facebook?
Bejar, the father of two young children, says that protecting kids and strengthening connections between Facebook users makes the site more self-sustaining in the long run. The project will have succeeded, he says, if it encourages more users to think twice before posting a photo that might embarrass a friend, or even to notify that friend when they post a questionable image.
“It’s those kinds of kind, compassionate interactions,” he says, “that help build a sustainable community.”
Source: greatergood.berkeley.edu & Facebook Community.