
CT: You know where you got that shirt from, and it damn sure wasn't the men's dept

NFL football is a pretty dumb sport, given that 90% of winning is just having a decent QB on a non-max deal. The Seahawks were great until they had to pay Wilson. The Chiefs were great until they had to pay Mahomes. The Chargers and Cardinals will be good now because they have good rookie QBs, so they can spend money elsewhere. The main reason Brady has all these recent rings is that he went the Duncan route and was willing to sacrifice salary so his teams could load up around him. Brady is just doing the whole pay-to-win thing from phone games.

Mahomes's cap hit this year is $7.4 million. Kyler Murray's is $9.7 million.
 
I either watched a bit of a replay of something yesterday or completely imagined the Astros winning Game 1 last night
 
Mahomes's cap hit this year is $7.4 million. Kyler Murray's is $9.7 million.

That’s just because Murray was the #1 overall pick. The Chiefs aren’t signing players for just this year; they’re signing them for next year, when Mahomes’s cap hit jumps to $35.8M, and the year after, when it goes to $46.8M.
 
https://www.yahoo.com/news/story-carol-karen-two-experimental-080010755.html

USA TODAY
The story of Carol and Karen: Two experimental Facebook accounts show how the company helped divide America
Jessica Guynn and Kevin McCoy, USA TODAY
Mon, October 25, 2021, 12:47 PM
In 2019, two users joined Facebook. Both had similar interests: young children and parenting, Christianity, civics and community.

"Carol," 41, was a conservative from North Carolina. She was interested in news, politics, then-President Donald Trump and the nation's first family. She followed the official accounts for Trump, first lady Melania Trump and Fox News.

"Karen" was the same age and lived in the same state. But she was a liberal who liked politics, news and Sens. Bernie Sanders and Elizabeth Warren. She disliked Trump. She followed a local news site, pages about North Carolina and the liberal advocacy group MoveOn.

Facebook's algorithms got to work, suggesting what they'd be interested in.

Accepting recommendations for sites supportive of Trump led Carol to suggestions for a site called “Donald Trump is Jesus,” and another for QAnon, a wide-ranging extremist ideology that alleges celebrities and top Democrats are engaged in a pedophile ring. Karen was presented with anti-Trump pages, including one that posted an image showing an anus instead of Trump's mouth.

The two women were not real. They were created by a Facebook researcher to explore how the social media platform deepened political divides in the U.S. by recommending content rife with misinformation and extremism.

The experiment shows that Facebook, which had 2.9 billion monthly active users as of June 30, knew before the 2020 presidential election that its automated recommendations amplified misinformation and polarization in the U.S., yet the company largely failed to curtail its role in deepening the political divide.

Reports describing the experiments are among hundreds of documents disclosed to the Securities and Exchange Commission and provided to Congress in redacted form by attorneys for Frances Haugen, a former Facebook employee. The redacted versions were obtained by a consortium of 17 news organizations, including USA TODAY.

In the summer of 2019, a Facebook researcher created two fictitious accounts with similar demographics but opposite political beliefs. Facebook's recommendation algorithm quickly suggested the users follow accounts on extreme ends of the political spectrum.
Jose Rocha said he's experienced the divisiveness firsthand.

A military veteran who grew up in a Democratic, pro-union family in Selah, Washington, Rocha said Facebook normalized racist views and led him down a rabbit hole to far-right ideologies.

For a time, Rocha said, he became a Nazi sympathizer and a backer of other extremist views – behavior he now blames on Facebook's recommendations system.

"I wouldn't have even known they existed if it wasn't for Facebook. So I wouldn't have went out seeking them," said Rocha, 27.

Bill Navari, 57, a conservative sports commentator from Pittsburgh, said a cousin blocked him on Facebook after he suggested she get her TDS ("Trump derangement syndrome") checked.

“I’ve seen people on Facebook saying, ‘If you are voting for Trump, unfriend me.' But I didn’t see anyone saying, ‘If you are voting for Biden, unfriend me,'” he said. “Facebook has become like oil and water, and never the two shall meet.”

These days, he steers clear of political debates on Facebook.

“I’ll post pics of my family, of my dog, where we went on vacation, and I stay in touch with the friends and family. But posting a meme or putting something on Facebook, it’s not going to change anyone’s mind,” he said. “I just think the conversation has become so coarse.”

Is Facebook to blame? “I don’t like pointing fingers without direct knowledge,” he said. “But I do think that Facebook is a party to this.”

The internal Facebook documents show how swiftly the platform's recommendation algorithms can amplify polarization by sending users to content full of misinformation and extremism.

The company's experiment with the hypothetical conservative user was called "Carol's Journey to QAnon." Within five days of going live on June 2, 2019, the user was barraged by "extreme, conspiratorial and graphic content," the researcher wrote.

One of the recommendations included an image labeling former President Barack Obama a "traitor" with a caption that read, "When we're done he'll claim Kenyan citizenship as a way to escape." (Despite racist claims to the contrary, Obama is a U.S. citizen.)

The report on the fictitious liberal user was called "Karen and the Echo Chamber of Reshares." That account went live on July 20, 2019. Within a week, Facebook's recommendations pivoted to "all anti-Trump content." Some recommendations came from a small Facebook group that had been flagged for "promoting illegal activity," the Facebook researcher wrote.

One image served to Karen showed then-first lady Melania Trump's face superimposed on the body of a bikini-clad woman kneeling on a bed. The caption read, "Melania Trump: Giving evangelicals something they can get behind."

Facebook whistleblower Frances Haugen appears before the Senate Commerce, Science, and Transportation Subcommittee at the Russell Senate Office Building on October 05, 2021, in Washington, D.C. Haugen left Facebook in May and provided internal company documents about Facebook to journalists and others, alleging that Facebook consistently chooses profits over safety. (Photo by Matt McClain-Pool/Getty Images)
Haugen, the former Facebook employee who blew the whistle on the company, was a product manager on Facebook's Civic Integrity team, focusing on elections. She had a front-row seat to the most divisive political events in recent memory, including the Jan. 6 insurrection, in which Trump supporters tried to block Congress from certifying Joe Biden's win in the presidential election.

Concerned that Facebook was prioritizing profits over the well-being of its users, Haugen reviewed thousands of documents over several weeks before leaving the company in May.

The documents, some of which have been the subject of extensive reporting by The Wall Street Journal and CBS News' "60 Minutes," detail company research showing that toxic and divisive content is prevalent in posts boosted by Facebook and shared widely by users.

"I saw Facebook repeatedly encounter conflicts between its own profits and our safety. Facebook consistently resolves these conflicts in favor of its own profits," Haugen alleged during a Senate hearing this month. "The result has been more division, more harm, more lies, more threats and more combat."

Haugen has called on Facebook to limit its practice of prioritizing content that has drawn shares and comments from many users.

She has sought federal whistleblower protection from the SEC, alleging that Facebook, a publicly traded company, misled investors. She could get a financial award if the SEC were to penalize the company.

In this file photo illustration, a smartphone displays the logo of Facebook on a Facebook website background, on April 7, 2021, in Arlington, Virginia. Facebook's independent Oversight Board announced on April 13, 2021, it would start accepting requests to remove "harmful content" that users believe has been wrongly allowed to remain on the leading social network. The move broadens the mandate of the so-called "supreme court" of Facebook, which up to now had been tasked with reviewing instances of whether content was improperly taken down from Facebook or Instagram.
Facebook denies that it is the cause of political divisions in the U.S.

“The rise of polarization has been the subject of serious academic research in recent years but without a great deal of consensus," said spokesman Andy Stone. "But what evidence there is simply does not support the idea that Facebook, or social media more generally, is the primary cause of polarization."

Facebook cited a research study that showed polarization has declined in a number of countries with high social media use even as it has risen in the U.S.

As for the test accounts, Stone said the experiment was "a perfect example of research the company does to improve our systems and helped inform our decision to remove QAnon from the platform."

Facebook tweaks its algorithms to increase engagement
After Russia used Facebook to interfere in the 2016 presidential election, pressure built on the company and its CEO, Mark Zuckerberg, to do something about misinformation and divisive content.

Meanwhile, critics charged that the company's apps exploited human psychology to hook people on social media, hijacking their time and undermining their well-being.

Facebook and Instagram ads linked to Russia during the 2016 election.
Especially worrying to company leaders was that users were less engaged on the platform. They scrolled through updates on their timelines, reading articles and watching videos. But they commented and shared posts less than before.

In response, Facebook radically altered the algorithm that determines what to display at the top of users' News Feed, the stream of posts from friends, family, groups and pages. The change was aimed at bringing users more updates from friends and family that spark meaningful social exchanges, the company said at the time.

But the focus on posts with high numbers of comments and likes rewarded outrage and resulted in the spread of more misinformation and divisive content, according to internal documents reviewed by USA TODAY. The more negative or incendiary the post, the further and faster it spread.

The change was noticeable.

Kent Dodds, a software engineer from Utah, said he rarely uses Facebook. In September 2019, he hopped on to voice his support for then-Democratic presidential candidate Andrew Yang.

Soon Dodds' News Feed shifted. Instead of seeing posts from his social circle, he was bombarded by political posts from distant Facebook connections.

“I remember coming away from that thinking, Facebook wants me to fight. They really want me to engage with these friends I haven’t talked to in a long time about their very different political views, and clearly not in a positive way,” Dodds said.

“Whether or not Facebook is intentional about what their algorithm is doing, it is responsible for it, and it’s doing harm to our society and they should be held accountable,” he said.

This has absolutely happened to me. It just feeds me stuff from rednecks I went to HS with that I haven't thought about in 20 years. A couple ended up blocking me.
 
That’s just because Murray was the #1 overall pick. The Chiefs aren’t signing players for just this year; they’re signing them for next year, when Mahomes’s cap hit jumps to $35.8M, and the year after, when it goes to $46.8M.

Yes, they signed a guard to the largest-ever contract for an interior lineman this offseason, handed out a $5.5 million contract to an IDL, and brought in a Pro Bowl tackle, all despite the salary cap dropping drastically from last year. Not exactly the moves of a team hamstrung by cap constraints.
 
Those articles weirdly portray Facebook as a passive observer of its own algorithms. “Carol” and “Karen” were tests to see if the algorithms worked as intended. They did, so Facebook approved.

These stories treat division and social upheaval as unfortunate outcomes of Facebook algorithms. I don’t think that’s the case. I think Facebook correctly sees actual face-to-face human connection as competition. If people are hanging out with real-life family and friends, they’re spending less time on Facebook. With the pandemic and an election, it was a perfect time to divide people and get them to become more loyal to Facebook friends than to real-life connections.
 
And Russ "got paid" in 2015, so if "until they had to pay" actually means "the year before they had to pay," that would include the Seahawks' 2014 Super Bowl appearance in the "not being great" category.

Obviously, paying sub-market value for a QB is a tremendous advantage. Also obviously, the Chiefs are paying well below market to Mahomes this year and his extension is not the cause of their early-season struggles. It's Plama-level analysis at its finest.
 
OK, so the Chiefs are gonna struggle in two years, but it's not the cause of their demise this year (a quick Google search suggests it's poor recent drafting and free-agent signings). Still doesn't change the fact that it kinda sucks to get invested in someone, only to have the math give them a 4-5 year window, tops. Allen, Burrow and Murray are gonna be great for a few years, then they'll likely never get past the championship round again. What fun is it watching Rodgers, the best QB of his generation, play with relatively little talent around him for a decade?
 
A large part of me wishes I could express myself as if I were Logan Roy and not be a people pleaser. Fuck off.
 
I drink 2-4 nights a week. During the week, it's mostly one drink... Either a drink with the wife while we make dinner or something after the kids go to bed and I'm relaxing on the couch.

Might have 2+ on Friday or Saturday, depending on what we're doing. Anticipating a healthy (or unhealthy) amount of drinking this Saturday.
 
I have fallen into just day-drinking on Saturdays and Sundays, usually just 2-3 drinks. Sometimes a Friday night if I'm not getting up to run or hike at sunrise on Saturday. My early-AM stuff normally keeps me from a weeknight beer. That will probably change once basketball starts and self-medication becomes the general practice.
 