How disinformation is reshaping political campaigns – CBS News
The rampant growth of disinformation is creating an ever-evolving problem for politicians. A new book called “The Lie Detectives” seeks to understand the players fighting against the issue, and what they’re trying to teach political campaigns. Author Sasha Issenberg joins CBS News to explain.


Peru dog inspires Facebook community


Mar. 25—”Just this side of heaven is a place called Rainbow Bridge.”

For those not familiar with the poem, the rest of the words paint a picture of a peaceful place where pets that have passed are restored to health and watch over their owners on earth.

And if Rainbow Bridge had a spokesman, or rather a spokesdog, it’d likely be Max.

Peru resident Joe DeRozier remembers the first time he ever met Max, the floppy-eared Maltese with a big-dog attitude.

“My wife (Kathy) and I got married 13 years ago, and I had just lost my last dog a few years before that,” he said. “And she had a dog (Max) because she wanted a dog in the house. … So when I met Kathy, and it ended up being serious, it was a package deal. I got the dog too.”

It didn’t take long for Max to endear himself to DeRozier either, or for DeRozier to do the same with Max.

“I was his play buddy,” DeRozier said. “I was the one that was going to play with him and get him to do stuff. And of course, I was attached to him just like that, I mean, how could you not be?”

And that bond grew even deeper after Max’s accident in late 2022, DeRozier explained.

“We have a pond by our house,” he said. “Max always went out the back door and down the steps and over to use the bathroom while I’d be getting his meal ready. Then he’d just run back up the stairs and into the house again. But on that day when I looked back after fixing his meal, he wasn’t inside.”

He wasn’t on the outside steps either.

“So then I went to the porch, and I saw him running along the water,” DeRozier said. “He didn’t see well enough to be running along the water like that. So I went back inside to grab my shoes. And now when I think about it, I should have just taken off running. But I grabbed my shoes. And when I went back outside, he was gone.”

DeRozier then took off running toward the pond’s edge.

A few moments later, DeRozier spotted Max below the surface of the water.

It took a while to pull Max out of the pond that day, and DeRozier said many vets told him it would be best to put Max down after that incident because he was likely “brain dead.”

But the dog pulled through, DeRozier said smiling.

And within two weeks, he was the same old Max.

It was also around that time DeRozier, who has written eight books, decided to create a Facebook group detailing Max’s recovery and his everyday life.

He titled the page, “Max-Man,” and it quickly garnered attention online.

But what made the page extra unique was everything told was from Max’s perspective.

Visitors to the page were able to hear Max talk about his interactions with his family and friends, and they were even able to wish the dog a happy 16th birthday.

But then came November 2023.

“Hi everyone,” a post read on Nov. 28. “It’s me, your puppy Max. Thank you so much for all of the love you’ve given me. I love you too. I have to go now. I’m sorry. Don’t forget me, OK? I won’t ever forget you. Love always, your puppy Max.”

“Max had just gotten so bad,” DeRozier told the Tribune, remembering those days of confusion and fear last November. “He had pancreatitis. He had liver issues. He had dementia. He was on three different medications all the time because when he was recovering from the drowning, he started to suffer from canine cognitive dysfunction (CCD). He also had been having strokes.

“And Max was such a mess,” DeRozier added. “He hadn’t slept for two days, and I hadn’t slept for two days because I was up with him.”

So DeRozier and his wife made the painful decision to have Max euthanized.

And that’s when DeRozier said Max’s ultimate mission began.

“After he died, I wrote a post from Max’s point of view obviously,” he said. “I told everyone how much I (writing as Max) loved them and would miss them. And then people responded. But they didn’t say, ‘Oh, I’m so sorry, Joe.’ They were saying, ‘Oh, Max, we love you and will miss you so much.'”

So after about a week, DeRozier said he had an idea.

“I just went ahead and made Max the spokesperson for Rainbow Bridge,” he said, smiling through his tears. “And people love it. So he’s at the bridge now, and he’s happy. He’s making new friends. Because that’s the thing. There’s a lot of people on there whose pets have passed, and everybody always wants to know that their pet is OK.”

DeRozier said the posts have become his therapy, but he hopes they are also therapy for those who read them.

Because visitors to the site aren’t just reading Max’s adventures.

It’s become a completely interactive experience for the group’s 1,300 followers.

“People write on there and ask Max to find their own pets up there at the Rainbow Bridge,” DeRozier said. “And they aren’t talking to me. They’re talking directly to Max. They want Max to be the messenger.”

So he is.

And in DeRozier’s weekly posts, Max and those other animals enjoy big meals together, movie dates and playing endless games of fetch.

But their favorite activity, DeRozier believes, is the visits.

“Max writes that every night before they fall asleep, they all talk about their mommies and daddies and how they each had the best ones,” he said. “And then as a pack, they all come down and visit their mommies and daddies and let them know they’re all right.”

“Yes, I know people know it’s not real, but they still need it because it makes them feel better,” DeRozier added. “It makes them feel like it’s OK to be sad. It’s OK to grieve. It’s OK to miss them. We’re all going through something together. There are people that write to Max and say they lost their pets 10 years ago, and they still think about them all the time. Then they want Max to go find them and tell them that.”

And perhaps that’s the true success behind the “Max-Man” page, DeRozier confessed.

People just want to know that their beloved pet will never be forgotten.

“In my mind, Max was given a job by God to do this,” DeRozier said. “That’s how I think. That’s how I have to think. If I were to think about it another way, I’d have to say that he was gone. And this way, he’s not really gone. He’s still here, and he’s doing great things. Whoever created the Rainbow Bridge did (that) so we can all feel better. And this is just a continuation of that. This is just giving us messages from there.”




Threads loses half its users after massive surge at launch – CBS News
Mark Zuckerberg’s new app, Threads, hasn’t been able to maintain its explosive debut, losing half its users since launch. In other Meta news, the company has announced a new game for its VR world. Alexander Konrad, the senior editor of Forbes magazine, joined CBS News to talk about it all.


Facebook opened its doors to researchers. What they found paints a complicated picture of social media and echo chambers.


A landmark study of how Facebook shaped the news users saw in the run-up to the 2020 election has found that the platform produced “significant ideological segregation” in political news exposure, with conservative users in particular more walled off and exposed to far more misinformation than their liberal counterparts.

Looking at aggregated data from 208 million U.S. users, researchers found “untrustworthy” news sources were favored by conservative audiences and almost all (97%) of the political news webpages rated as false by Meta’s third-party fact-checkers were seen by more conservatives than liberals.

Overall, the research, led by Sandra González-Bailón, a professor at the University of Pennsylvania’s Annenberg School for Communication, found that Facebook’s pages and groups contributed more to this ideological segregation and polarization than users’ friends did. In general, it concluded that conservative sources dominated Facebook’s news ecosystem.

It was not clear whether this segregation was caused more by algorithms or user choice. 

“These feedback loops are very difficult to disentangle with observational data,” González-Bailón said. “These require more research.”

The study is one of four published Thursday, three in the journal Science and one in Nature. They were part of an unprecedented partnership between a group of prominent independent academic researchers and researchers at Meta with the aim of studying the impact Facebook and Instagram had on U.S. users during the 2020 elections. 

The project included 17 academic researchers from 12 universities who were granted deep access by Facebook to aggregated data. The researchers collaborated with more than two dozen researchers, engineers and legal analysts at Meta. 

The independent researchers were not paid by Meta, and the social media company agreed not to reject research questions for anything outside privacy or logistical reasons. Meta also relinquished the right to restrict or censor the researchers’ final findings. In the interest of transparency, the collaboration was monitored by an independent rapporteur, whose report was also published in Science on Thursday. 

Together the studies offer the deepest look yet at how news flowed across Facebook and a more limited idea of how that news may or may not have affected political polarization.

But the research is also limited in its scope. The algorithm experiments were conducted on only two platforms, Facebook and Instagram, over three months, a relatively short amount of time at the height of a contentious presidential election. 

Political content has traditionally been only a small part of what Facebook users see, and the platform has since sought to reduce how much political content is shown to users. In 2021, the company said it was doing initial tests on reducing political content in its News Feed, culminating in an update in April in which the company said it was continuing to refine its approach to such content and moving away from rankings based on engagement. Posts linking to political news webpages amounted to around 3% of all posts shared on Facebook, according to the González-Bailón study. 

Each of the four studies found Meta’s recommendation algorithms — the complicated rules and rankings behind how platforms feed content and communities to their users — to be extremely influential in deciding what those users see and how they interact with content. 

Three out of the four studies experimented with the algorithm and concluded that the kind of tweaks long hypothesized to be the solutions to polarization and the key to healthier online experiences may not affect people’s political attitudes and real-world behaviors, at least in the short term. Such tweaks include reverting to chronological feeds, reducing virality by limiting reshared content or breaking up echo chambers.

“These findings should give all of us pause, including policymakers, about any simple sort of solution,” said Talia Stroud, a professor at the University of Texas at Austin, who helped lead the research project. 

For the three experimental studies, paid participants allowed researchers to manipulate their experience on the platforms in some way. They used the platforms as usual, completed surveys on political attitudes and activities throughout the three-month period, and shared their online activity on and off the studied platforms.  

In one study, led by Andrew Guess, an assistant professor of politics and public affairs at Princeton University, researchers randomly assigned participants a reverse chronological feed on Facebook and Instagram, showing newest posts first without any other algorithmic weighting. 

In 2021, Facebook whistleblower Frances Haugen and some lawmakers suggested a time-ordered feed could fix the myriad problems that come with recommendation algorithms, which critics argue are engineered to keep users engaged and enraged. The next year, Facebook rolled out customizable feeds, though it’s unclear how many people utilize these options. 

The new study doesn’t offer much hope for chronological feeds as a silver bullet, and some of the findings can appear contradictory. Facebook users who saw the newest posts first encountered more political and untrustworthy content (by more than two-thirds), but less “uncivil” content (by almost half). At the same time, they were also shown more posts from their moderate friends and from ideologically mixed groups and pages.

One significant effect: Without the sophisticated algorithm, researchers reported that users liked, commented and shared less often and spent “dramatically less” time on Facebook and Instagram overall. Instead, mobile users who had been switched to reverse chronological feed spent more time on TikTok and YouTube. Desktop users spent less time on Facebook and more time on Reddit. 

Despite the effects on user experience, changing to a chronological feed didn’t affect participants’ levels of self-reported political polarization, knowledge or attitudes. 

In a second study, the researchers experimented with virality, cutting off some participants from the ability to see content reshared from friends, groups or pages. Turning off what amounts to about one-quarter of posts viewed on Facebook had a measurable effect. Users saw less political news, clicked less on partisan news and were exposed to fewer posts containing untrustworthy content. 

This seems to support the common belief, noted in leaked internal Facebook research reports, that emotionally charged and political content gets reshared more often. Still, as with the chronological feed, researchers couldn’t find any link to a shift in users’ political attitudes or behavior.

The third experiment investigated echo chambers, testing what happens when people see less content from like-minded groups and pages.

First, the research confirmed that echo chambers are real — the majority of content users saw came from groups and friends who shared political leanings. Just over half came from like-minded sources and just under 15% came from people or groups with different political leanings. As with the other experiments, reducing content from like-minded sources while increasing exposure to people and content from other points of view had no real effect on polarization or political preferences or opinions as measured by the study.

Limitations notwithstanding, Nick Clegg, president of Global Affairs at Meta, trumpeted the findings as an exoneration of Facebook and its role in politics and elections. He wrote in a blog post that the papers are the first time that the company has opened itself to academics in this way, and that they showed Facebook had no role in the toxicity of U.S. politics.

“These findings add to a growing body of research showing there is little evidence that social media causes harmful ‘affective’ polarization or has any meaningful impact on key political attitudes, beliefs or behaviors,” he wrote. “They also challenge the now commonplace assertion that the ability to reshare content on social media drives polarization.”

The researchers behind the new studies were more restrained. 

“These findings don’t mean there aren’t reasons for concern about social media,” said Brendan Nyhan, a professor in the department of government at Dartmouth College and one of the lead authors behind the echo chambers study. “But the study is important in that it challenges some notions people have about the effects of social media. And it might help reorient that conversation.”

The researchers said the collaborative new studies underscore the need for tech companies to provide greater access to data. 

“What we were able to do here — unpacking the sort of black box of algorithms, providing all these kinds of details about what’s happening on these platforms — is a huge illustration of the value of making sure that platforms make data available to external researchers,” said Joshua Tucker, project lead and co-director of the NYU Center for Social Media and Politics.

Still, collaborations with platforms may not be the model for research going forward, and perhaps they shouldn’t be, according to Michael W. Wagner, professor in the University of Wisconsin-Madison’s School of Journalism and Mass Communication, who served as the collaboration’s independent rapporteur.

In an article about the project for Science, Wagner wrote the researchers had conducted “rigorous, carefully checked, transparent, ethical, and path-breaking studies.” But future scholarship should not depend, he wrote, on obtaining a social media company’s permission.

Additional studies from the project, currently in the peer-review process, are expected in the coming months.




Less than a month left to apply for Facebook settlement money – CBS News
Facebook users have until Aug. 25 to get their share of the $725 million settlement over the social network’s privacy violations. Takendra Parmar, tech features editor for Insider, joined CBS News to talk about the reasoning behind the settlement and expectations for individual payouts.
