
Unsolicited advice might not be so bad after all?

I’ve followed Tressie McMillan Cottom on Twitter ever since she did a keynote for ALT a few years ago. In this article for The New York Times, she talks about ‘advice culture’.

Cottom is wonderfully forthright in her interactions on Twitter, so I was expecting her to rail against advice culture. Instead, she talks about it as a form of small talk, and (I suppose) a form of necessary social glue.

In the social media era, advice culture feels bigger and more pervasive than ever. After all, what is social media if not the gamification of advice? Every time we post something on Facebook or Twitter or Instagram, we are implicitly asking others to make a judgment of us. And we find ourselves unable to understand why someone would post or share an experience if not to solicit our evaluation. That is what “likes” and comments and “friending” have done to our brains…

Advice culture is so pervasive that it must serve some other function, do something more than assuage insecurities or perform status. Sociologists generally agree that advice is up there with small talk for how it facilitates human connection between strangers. But I recently began thinking advice is no longer a mere subset of small talk but has become our culture’s default common language. Advice is small talk. The decline of social associations like the Rotary Club and the bowling leagues not only weakened our connections to community; it also atrophied our linguistic tool kit.

Source: Why Everyone Is Always Giving Unsolicited Advice | The New York Times

Big Tech companies may change their names but they will not voluntarily change their economics

I based a good deal of Truth, Lies, and Digital Fluency, a talk I gave in NYC in December 2019, on the work of Shoshana Zuboff. Writing in The New York Times, she gets a bit more practical about what we can actually do about surveillance capitalism.

As Zuboff points out, Big Tech didn’t set out to cause the harms it has any more than fossil fuel companies set out to destroy the earth. The problem is that they are following economic incentives. They’ve found a metaphorical goldmine in hoovering up and selling personal data to advertisers.

Legislating to address that core issue looks like it would be far more fruitful in the long term. Other calls, like “breaking up Big Tech”, are the equivalent of rearranging the deckchairs on the Titanic.

Democratic societies riven by economic inequality, climate crisis, social exclusion, racism, public health emergency, and weakened institutions have a long climb toward healing. We can’t fix all our problems at once, but we won’t fix any of them, ever, unless we reclaim the sanctity of information integrity and trustworthy communications. The abdication of our information and communication spaces to surveillance capitalism has become the meta-crisis of every republic, because it obstructs solutions to all other crises.

[…]

We can’t rid ourselves of later-stage social harms unless we outlaw their foundational economic causes. This means we move beyond the current focus on downstream issues such as content moderation and policing illegal content. Such “remedies” only treat the symptoms without challenging the illegitimacy of the human data extraction that funds private control over society’s information spaces. Similarly, structural solutions like “breaking up” the tech giants may be valuable in some cases, but they will not affect the underlying economic operations of surveillance capitalism.

Instead, discussions about regulating big tech should focus on the bedrock of surveillance economics: the secret extraction of human data from realms of life once called “private.” Remedies that focus on regulating extraction are content neutral. They do not threaten freedom of expression. Instead, they liberate social discourse and information flows from the “artificial selection” of profit-maximizing commercial operations that favor information corruption over integrity. They restore the sanctity of social communications and individual expression.

No secret extraction means no illegitimate concentrations of knowledge about people. No concentrations of knowledge means no targeting algorithms. No targeting means that corporations can no longer control and curate information flows and social speech or shape human behavior to favor their interests. Regulating extraction would eliminate the surveillance dividend and with it the financial incentives for surveillance.

Source: You Are the Object of Facebook’s Secret Extraction Operation | The New York Times

Leisure is what we do for its own sake. It serves no higher end.

Yes, yes, and yes. I agree wholeheartedly with this view that places human flourishing above work.

To limit work’s negative moral effects on people, we should set harder limits on working hours. Dr. Weeks calls for a six-hour work day with no pay reduction. And we who demand labor from others ought to expect a bit less of people whose jobs grind them down.

In recent years, the public has become more aware of conditions in warehouses and the gig economy. Yet we have relied on inventory pickers and delivery drivers ever more during the pandemic. Maybe compassion can lead us to realize we don’t need instant delivery of everything and that workers bear the often-invisible cost of our cheap meat and oil.

The vision of less work must also encompass more leisure. For a time the pandemic took away countless activities, from dinner parties and concerts to in-person civic meetings and religious worship. Once they can be enjoyed safely, we ought to reclaim them as what life is primarily about, where we are fully ourselves and aspire to transcendence.

Leisure is what we do for its own sake. It serves no higher end.

Source: Returning to the Office and the Future of Work | The New York Times