Alterslash

the unofficial Slashdot digest
 

Contents

  1. Quadratic Gravity Theory Reshapes Quantum View of Big Bang
  2. Scientists Shocked To Find Lab Gloves May Be Skewing Microplastics Data
  3. AI Data Centers Can Warm Surrounding Areas By Up To 9.1C
  4. Microsoft Plans To Build 100% Native Apps For Windows 11
  5. After 16 Years and $8 Billion, the Military’s New GPS Software Still Doesn’t Work
  6. Samsung Is Bringing AirDrop-Style Sharing to Older Galaxy Devices
  7. OkCupid Settles FTC Case On Alleged Misuse of Its Users’ Personal Data
  8. Life With AI Causing Human Brain ‘Fry’
  9. Judge Allows BitTorrent Seeding Claims Against Meta, Despite Lawyers ‘Lame Excuses’
  10. Microsoft Copilot Is Now Injecting Ads Into Pull Requests On GitHub
  11. Sony Shuts Down Nearly Its Entire Memory Card Business Due To SSD Shortage
  12. Tech CEOs Suddenly Love Blaming AI For Mass Job Cuts
  13. New Company Hopes to Build Age-Verification Tech into Vape Cartridges
  14. Apple’s Early Days: Massive Oral History Shares Stories About Young Wozniak and Jobs
  15. Rivian and Lucid Win Right to Sell Their EVs Directly to Buyers in Washington State

Alterslash picks up to the best 5 comments from each of the day’s Slashdot stories, and presents them on a single page for easy reading.

Quadratic Gravity Theory Reshapes Quantum View of Big Bang

Posted by BeauHD
Researchers at the University of Waterloo say a new “quadratic quantum gravity” framework could explain the universe’s rapid early expansion without adding extra ingredients to Einstein’s theory by hand. The idea is especially notable because it makes testable predictions, including a minimum level of primordial gravitational waves that future experiments may be able to detect. “Even though this model deals with incredibly high energies, it leads to clear predictions that today’s experiments can actually look for,” said Dr. Niayesh Afshordi, professor of physics and astronomy at the University of Waterloo and Perimeter Institute (PI). “That direct link between quantum gravity and real data is rare and exciting.” Phys.org reports:
The research team found that the Big Bang’s rapid early expansion can emerge naturally from this simple, consistent theory of quantum gravity, without adding any extra ingredients. This early burst of expansion, often called inflation, is a central idea in modern cosmology because it explains why the universe looks the way it does today.

Their model also predicts a minimum amount of primordial gravitational waves, which are tiny ripples in spacetime geometry created in the first moments after the Big Bang. These signals may be detectable in upcoming experiments, offering a rare chance to test ideas about the universe’s quantum origins.

[…] The team plans to refine their predictions for upcoming experiments to explore how their framework connects to particle physics and other puzzles about the early universe. Their long-term goal is to strengthen the bridge between quantum gravity and observational cosmology.
The research has been published in the journal Physical Review Letters.

Get Off My Lawn

By Spinlock_1977 • Score: 3 Thread

Don’t you young whipper-snappers go gettin’ yer new-fangled quadratic quantum gravity up in my 11-dimensional string theory!

Scientists Shocked To Find Lab Gloves May Be Skewing Microplastics Data

Posted by BeauHD
Researchers found that common nitrile and latex lab gloves can shed stearate particles that closely resemble microplastics, potentially “increasing the risk of false positives when studying microplastic pollution,” reports ScienceDaily.

“We may be overestimating microplastics, but there should be none,” said Anne McNeil, senior author of the study and U-M professor of chemistry, macromolecular science and engineering. “There’s still a lot out there, and that’s the problem.” From the report:
Researchers found that these gloves can unintentionally transfer particles onto lab tools used to analyze air, water, and other environmental samples. The contamination comes from stearates, which are not plastics but can closely resemble them during testing. Because of this, scientists may be detecting particles that are not true microplastics. To reduce this issue, U-M researchers Madeline Clough and Anne McNeil recommend using cleanroom gloves, which release far fewer particles.

Stearates are salt-based, soap-like substances added to disposable gloves to help them separate easily from molds during manufacturing. However, their chemical similarity to certain plastics makes them difficult to distinguish in lab analyses, increasing the risk of false positives when studying microplastic pollution.
“For microplastics researchers who have these impacted datasets, there’s still hope to recover them and find a true quantity of microplastics,” said researcher and recent doctoral graduate Madeline Clough. “This field is very challenging to work in because there’s plastic everywhere,” McNeil said. “But that’s why we need chemists and people who understand chemical structure to be working in this field.”
The findings have been published in the journal Analytical Methods.

Re:Latex schmubs

By bn-7bc • Score: 5, Informative Thread
Look, they don’t say that the microplastics are all from their gloves, they just say that their estimates might have been skewed by the gloves, i.e. “please be aware your results may need more scrutiny.” No need to throw all their results out at once.

To summarise then

By EldoranDark • Score: 3 Thread
We still know microplastics are bad because deliberately introducing them to lab animals produces all sorts of bad effects. Our samples might be skewed toward showing more of them in more places than there actually are, so perhaps things are not quite as bad yet. On the other hand, the same bias could mean that microplastics become a problem at smaller quantities than previously thought.

AI Data Centers Can Warm Surrounding Areas By Up To 9.1C

Posted by BeauHD
An anonymous reader quotes a report from New Scientist:
Andrea Marinoni at the University of Cambridge, UK, and his colleagues saw that the amount of energy needed to run a data centre had been steadily increasing of late and was likely to “explode” in the coming years, so wanted to quantify the impact. The researchers took satellite measurements of land surface temperatures over the past 20 years and cross-referenced them against the geographical coordinates of more than 8400 AI data centers. Recognizing that surface temperature could be affected by other factors, the researchers chose to focus their investigation on data centers located away from densely populated areas.

They discovered that land surface temperatures increased by an average of 2C (3.6F) in the months after an AI data center started operations. In the most extreme cases, the increase in temperature was 9.1C (16.4F). The effect wasn’t limited to the immediate surroundings of the data centers: the team found increased temperatures up to 10 kilometers away. Seven kilometers away, there was only a 30 percent reduction in the intensity. “The results we had were quite surprising,” says Marinoni. “This could become a huge problem.”

Using population data, the researchers estimate that more than 340 million people live within 10 kilometers of data centers, and so live in a place that is warmer than it would be if the data center hadn’t been built there. Marinoni says that areas including the Bajio region in Mexico and the Aragon province in Spain saw a 2C (3.6F) temperature increase in the 20 years between 2004 and 2024 that couldn’t otherwise be explained.
University of Bristol researcher Chris Preist said the findings may be more complicated than they look. “It would be worth doing follow-up research to understand to what extent it’s the heat generated from computation versus the heat generated from the building itself,” he says. For example, the building being heated by sunlight may be part of the effect.
The findings of the study, which has not yet been peer-reviewed, can be found on arXiv.

Farm pasture versus concrete buildings?

By will4 • Score: 5, Insightful Thread

Are they comparing farm pasture temperature readings versus temperature readings of concrete buildings and paved parking lots?

Re:I dunno understand.

By ebunga • Score: 5, Funny Thread

Most data centers don’t run windows. They run systemd.

7 KM away

By will4 • Score: 5, Insightful Thread

> the team found increased temperatures up to 10 kilometers away. Seven kilometers away, there was only a 30 percent reduction in the intensity.

This research should really compare the heat data centers produce on-site and in nearby spillover against other high-energy-use, high-heat-production industries such as steel mills.

Re:Farm pasture versus concrete buildings?

By maladroit • Score: 5, Interesting Thread

From page 5:

“These results are dramatically impressive, especially considering that the typical LST increase caused by the quintessential example of compound of anthropogenic activities - the urban heat island effect - has been estimated in the 4 and 6 [degrees] C interval. This apparent step function emphasize the clear effect of AI hyperscalers on their surrounding areas, so much that it can match the impact of “islands” of higher temperatures: therefore, we call this the data heat island effect.”

Re:Farm pasture versus concrete buildings?

By maladroit • Score: 4, Informative Thread

From the summary:

“Recognizing that surface temperature could be affected by other factors, the researchers chose to focus their investigation on data centers located away from densely populated areas.”

You’ll have to look at the study data to see if that completely addresses your concern, but unsurprisingly the professional researchers have put some thought into what controls a study like this might need.

Microsoft Plans To Build 100% Native Apps For Windows 11

Posted by BeauHD
Microsoft is reportedly shifting Windows 11 app development back toward fully native apps. Rudy Huyn, a Partner Architect at Microsoft working on the Store and File Explorer, said in a post on X that he is building a new team to work on Windows apps. “You don’t need prior experience with the platform… what matters most is strong product thinking and a deep focus on the customer,” he wrote. “If you’ve built great apps on any platform and care about crafting meaningful user experiences, I’d love to hear from you.” Huyn later said in a reply on X that the new Windows 11 apps will be “100% native.” TechSpot reports:
The description stands out at a time when many of Microsoft’s built-in tools, including Clipchamp and Copilot, rely on web technologies and Progressive Web App architectures. The company’s commitment to native performance suggests that some long-standing frustrations around responsiveness, memory use, and interface consistency could finally be addressed.

For Windows developers, Huyn’s comments hint at a change in direction. Microsoft’s recent development priorities have leaned heavily on web-based approaches, with Progressive Web Apps (PWAs) replacing or supplementing many native programs. […] Exactly which applications will be rebuilt, or how strictly “100% native” will be enforced, remains unclear. Some current Microsoft apps classified as native still depend on WebView for specific features. But the renewed emphasis already has developers paying attention.

Re:They don’t want to make other OSes more attract

By karmawarrior • Score: 5, Informative Thread

They’re not. Electron apps are not accessed via a browser. While it’s true you can easily port an Electron app to GNU/Linux, that’s also true of a .NET app (which, let’s be honest, is likely what they’re talking about here, I doubt they’re going back to C++ for everything.)

The real advantage of Electron is you can use most of the same code and assets for a website as for an Electron application, which is useful, but given how ridiculously inefficient Electron is, that isn’t much of a justification for using it. Over the last 15 years, most desktop operating systems’ UIs have been debased by increasingly inconsistent designs making them harder to use, and a huge amount of that has been designing for some superficial “web” design that doesn’t really exist - at least, not in a form that stands still.

My sense of this:

Microsoft is in a panic. Almost everything different between Windows 10 and Windows 11 is disliked, from the centralized logins to the AI-with-everything. On top of this, RAM prices are sky-high, meaning the bloat is rapidly becoming a problem. What they’ve realized is that they have to do a full overhaul of Windows 11. One of those changes is to stop using technologies like Electron where they shouldn’t be used. They could literally reduce its memory footprint to Windows 7 levels, and make their code more reliable and less dependent on third-party libraries and APIs, by eliminating a rather absurd example of abstraction-for-abstraction’s sake from their development stack.

This might even be good news.

Finally?

By TheDarkMaster • Score: 5, Insightful Thread
Finally? I’m tired of seeing apps in Windows 11 that are an integral part of the operating system, and should therefore be native, but were built with that total, complete, and absolute shit that is “web apps”. “Web apps” only make sense when you really need independence from the OS, to the point of accepting a loss of performance and very bad resource usage. They have absolutely no place in software that would never be used on another operating system anyway.

Re:They don’t want to make other OSes more attract

By TheDarkMaster • Score: 5, Informative Thread
Just a few years ago, an app with almost the same functionality as WhatsApp (though it wouldn’t have video or audio, since that wasn’t feasible back then on dial-up or DSL connections) wouldn’t have used more than 50MB even under heavy use. Nowadays, however, an app with the same goals easily exceeds 1.5GB of RAM.

1.5 GB of RAM for an instant messaging app. It was possible to run the entire Windows XP system plus user applications on 128MB of RAM… 256MB was a luxury.

And for those complete idiots who keep going on and on about how “memory that isn’t used is wasted memory,” I have two things to say to those clowns:

1) There is absolutely no reason to use 1GB of RAM for a task that you can easily handle with just 10MB of RAM. Just because your computer has 32GB of RAM doesn’t mean you have to use all of it just for your application;

2) Your application isn’t the only thing running on the user’s computer. What happens if the dozens of processes running on the user’s computer all have the same idiotic idea of trying to reserve all the computer’s memory for themselves?

Re: the only reason

By liqu1d • Score: 5, Informative Thread
You’re on iOS, right? Disabling “smart quotes” in keyboard settings will fix it. One day Slashdot might vibe-code some fixes and join the modern web… or they might not.

Re:developer market share

By DrXym • Score: 5, Insightful Thread
I’ve programmed Win32 for decades and while it was fine for the time, much of the user facing APIs are obsolete for modern GUI development and some of the non-user facing stuff too. But Microsoft really hasn’t produced a credible replacement for it and has shat out a succession of technologies one after the other that devs are supposed to use before Microsoft abandons them for the next - Win32 (and layers on top like MFC), WinForms, WPF (and Silverlight), UWP, Windows App SDK / WinUI.

Some of these technologies are overlapping, but each was intended to corral devs into making Metro apps or Windows Store apps and burn their bridges in the process. It went down like a lead balloon. Now they’re dialing back, trying to make WinUI somewhat agnostic to the version of Windows it’s running on, but who knows if it will stick. It’s not the only pain point, because Microsoft even extended the C++ language to deal with these APIs, with new types like “ref”, “partial” and hat notation to handle garbage-collected objects, auto-generated classes and other things that also impede portability.

So it’s no wonder that app developers have gone for web apps (and Qt), because it makes it easier to write portable apps and acts as insulation from Microsoft’s mercurial view of the world.

After 16 Years and $8 Billion, the Military’s New GPS Software Still Doesn’t Work

Posted by BeauHD
An anonymous reader quotes a report from Ars Technica:
Last year, just before the Fourth of July holiday, the US Space Force officially took ownership of a new operating system for the GPS navigation network, raising hopes that one of the military’s most troubled space programs might finally bear fruit. The GPS Next-Generation Operational Control System, or OCX, is designed for command and control of the military’s constellation of more than 30 GPS satellites. It consists of software to handle new signals and jam-resistant capabilities of the latest generation of GPS satellites, GPS III, which started launching in 2018. The ground segment also includes two master control stations and upgrades to ground monitoring stations around the world, among other hardware elements.

RTX Corporation, formerly known as Raytheon, won a Pentagon contract in 2010 to develop and deliver the control system. The program was supposed to be complete in 2016 at a cost of $3.7 billion. Today, the official cost for the ground system for the GPS III satellites stands at $7.6 billion. RTX is developing an OCX augmentation projected to cost more than $400 million to support a new series of GPS IIIF satellites set to begin launching next year, bringing the total effort to $8 billion.

Although RTX delivered OCX to the Space Force last July, the ground segment remains nonoperational. Nine months later, the Pentagon may soon call it quits on the program. Thomas Ainsworth, assistant secretary of the Air Force for space acquisition and integration, told Congress last week that OCX is still struggling.
The GAO found the OCX program was undermined by “poor acquisition decisions and a slow recognition of development problems.” By 2016, it had blown past cost and schedule targets badly enough to trigger a Pentagon review for possible cancellation.
Officials also pointed to cybersecurity software issues, a “persistently high software development defect rate,” the government’s lack of software expertise, and Raytheon’s “poor systems engineering” practices. Even after the military restructured the program, it kept running into delays and overruns, with Ainsworth telling lawmakers, “It’s a very stressing program” and adding, “We are still considering how to ensure we move forward.”

This How To Fix The Problem

By Marlin Schwanke • Score: 5, Funny Thread
Turn the project over to the Ukrainians; they’d get it fixed so they could use the full military-precision version of our GPS for themselves.

I wish it was corruption - it’s bad management

By FeelGood314 • Score: 5, Interesting Thread
You can fix corruption. I’ve been an embedded programmer on constrained devices for over 30 years, mostly as a contractor fixing messed-up projects. The quality of embedded projects slowly declined in the 2000s and 2010s, but has fallen off a cliff over the last 5 years. It used to be that the quality of embedded developers was higher than enterprise developers (there was a higher minimum bar just to get your code to run). It used to be that you had to plan and organize your projects, and the projects were run by engineers. You used to get support from your suppliers. In the 2010s, part suppliers gave up on support and started directing you to forums. When you do get documentation from suppliers, you have to guess at which parts are correct and what steps they have left out. I had one formerly reputable supplier fail for 3 months to get “Hello World” to work on their dev board.

Then there is the management of projects. Now the de facto person in charge is the one in charge of the Jira tickets. They decide what tasks get resources, and they decide which engineers do what tasks. They can’t understand a development plan, they can’t build the code, they usually can’t even use the product. What they can do is take bugs from testing, create tickets and assign engineers to the tickets. Their only metric is how fast tickets are resolved. They don’t care if the engineer they assign knows nothing about the project.

The most successful embedded engineers today:
- Don’t put useful comments in their code (the comments may describe how the code works, but the code says what it does)
- Never document (and most projects don’t even have a single place to put documents, or a way to find them)
- Are good at volunteering for easy or highly visible tickets
- Close bugs by creating global variables that track the condition of the bug, then adding a function that is called all the time, which checks the globals for the error condition, prevents (or masks) the error, and then (hopefully) cleans up the globals without creating too many other bugs. This means they never have to understand the code. If there is automated testing, they might never even need to know how to use the product.
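The bug-masking pattern described in the last bullet can be sketched like this. This is a deliberately bad example with invented names, not code from any real project: the symptom is recorded in a global instead of fixing the upstream bug, and a "watchdog" called from everywhere clamps the value after the fact.

```c
#include <stdbool.h>
#include <stddef.h>

/* Global that merely tracks the symptom of a bug
 * instead of fixing its root cause. */
static bool g_saw_bad_length = false;

/* Somewhere deep in the code, the symptom is noticed... */
static size_t parse_packet_length(const unsigned char *buf)
{
    size_t len = ((size_t)buf[0] << 8) | buf[1];
    if (len > 512) {             /* the real bug is upstream, but... */
        g_saw_bad_length = true; /* ...just note it and move on */
    }
    return len;
}

/* ...and a "fixer" called all over the place masks it. */
static size_t sanitize_length(size_t len)
{
    if (g_saw_bad_length) {
        g_saw_bad_length = false; /* hope nothing else needed this flag */
        return 512;               /* clamp instead of understanding */
    }
    return len;
}
```

The tests pass, the ticket closes, and nobody ever learns why oversized lengths arrive in the first place.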

I could do it cheaper

By marcle • Score: 5, Funny Thread

I’m not a programmer, but I could make a system that doesn’t work for only $4B. I’m only discounting it because I want to save taxpayer money.

Where is DOGE now?

By edi_guy • Score: 5, Insightful Thread

Subject is the question. Where is DOGE on the big stuff? The Pentagon wastes more every month in fraud, waste, and abuse than USAID spends annually. But somehow charity gets the axe and Raytheon keeps getting multi-billion-dollar contracts, no strings attached. Can anyone put aside the woke distraction and look at the serious problems?!?

Not Surprised

By Bahbus • Score: 5, Insightful Thread

RTX Corporation, formerly known as Raytheon, hasn’t produced a single thing of halfway decent quality in many decades and only hires the shittiest engineers.

Samsung Is Bringing AirDrop-Style Sharing to Older Galaxy Devices

Posted by BeauHD
Samsung is reportedly planning to roll out AirDrop-style file sharing for older Galaxy phones via a Quick Share update. Early reports suggest the feature is appearing on devices from the Galaxy S22 through the S25, though it is not actually working yet. Android Central reports:
As spotted by Reddit users (via Tarun Vats on X), a Quick Share app update is rolling out via the Galaxy Store on older Samsung devices that appears to add support for AirDrop file sharing with Apple devices. Users report seeing the same new “Share with Apple devices” section we first saw on Galaxy S26 devices in the Settings app after updating Quick Share.

The update is reportedly showing up on Galaxy models ranging from the Galaxy S22 to last year’s Galaxy S25 series. The catch, however, is that the feature doesn’t seem to be working yet. It’s appearing on devices running One UI 8 as well as the One UI 8.5 beta, but enabling the toggle doesn’t activate the functionality for now.

Users say that turning on the feature doesn’t make their device visible to Apple devices, and no Apple devices show up in Quick Share either. It’s possible Samsung or Google still needs to enable it server-side, but it does confirm that broader rollout to older Galaxy devices is coming. The feature could arrive fully with the One UI 8.5 update.

LocalSend already works with everything

By slaker • Score: 3 Thread

I don’t know why I should care about limited compatibility for a subset of devices with another subset of devices. There’s some of everything in my home. I found a tool called LocalSend years ago that allows me to do mildly obnoxious data transfers between arbitrary devices regardless of platform.

OkCupid Settles FTC Case On Alleged Misuse of Its Users’ Personal Data

Posted by BeauHD
OkCupid and parent company Match Group settled an FTC case dating back to 2014 over allegations that the dating app shared users’ photos and other personal data with a third party without proper disclosure or opt-out rights. Engadget reports:
According to the FTC, OkCupid’s privacy policy at the time noted that the company wouldn’t share a user’s personal information with others, except in some cases including “service providers, business partners, other entities within its family of businesses.” However, the lawsuit accused OkCupid of sharing three million photos of its users with Clarifai, which the FTC claims is an “unrelated third party” that didn’t fall under the allowed entities. On top of that, the lawsuit alleged that OkCupid didn’t inform its users of this data sharing, nor give them a chance to opt out.

Moving forward, the settlement would “permanently prohibit” Match Group, which owns OkCupid, and Humor Rainbow, which operates OkCupid, from misrepresenting what kind of personal information it collects, the purpose for collecting the data and any consumer choices to prevent data collection. Even after the 2014 incident, OkCupid was found to have security flaws that could’ve exposed user account info, which were quickly patched in 2020.

Every time

By Valgrus Thunderaxe • Score: 5, Insightful Thread
a company says they won’t share personal data, they do just that. Always, and without fail.

Life With AI Causing Human Brain ‘Fry’

Posted by BeauHD
fjo3 shares a report from France 24:
Too many lines of code to analyze, armies of AI assistants to wrangle, and lengthy prompts to draft are among the laments by hard-core AI adopters. Consultants at Boston Consulting Group (BCG) have dubbed the phenomenon “AI brain fry,” a state of mental exhaustion stemming “from the excessive use or supervision of artificial intelligence tools, pushed beyond our cognitive limits.”

The rise of AI agents that tend to computer tasks on demand has put users in the position of managing smart, fast digital workers rather than having to grind through jobs themselves. “It’s a brand-new kind of cognitive load,” said Ben Wigler, co-founder of the start-up LoveMind AI. “You have to really babysit these models.” […] “There is a unique kind of reward hacking that can go on when you have productivity at the scale that encourages even later hours,” Wigler said.

[Adam Mackintosh, a programmer for a Canadian company] recalled spending 15 consecutive hours fine-tuning around 25,000 lines of code in an application. “At the end, I felt like I couldn’t code anymore,” he recalled. “I could tell my dopamine was shot because I was irritable and didn’t want to answer basic questions about my day.”

BCG recommends in a recently published study that company leaders establish clear limits regarding employee use and supervision of AI. However, “That self-care piece is not really an American workplace value,” Wigler said. “So, I am very skeptical as to whether or not it’s going to be healthy or even high quality in the long term.”
Notably, the report says everyone interviewed for the article “expressed overall positive views of AI despite the downsides.” In fact, a recent BCG study actually found a decline in burnout rates when AI took over repetitive work tasks.

As expected

By MpVpRb • Score: 5, Insightful Thread

It takes a while to learn how to effectively use new tech, especially powerful tech that is rapidly changing.
Expect more confusion and disruption before things stabilize.

Re: Thinking vs drudge work

By Big Hairy Gorilla • Score: 5, Insightful Thread
I don’t think you’re wrong, but… we aren’t just talking about “thinking”. You can’t analyze code without experience. Junior people can’t understand without having DONE stuff. They won’t have anything to compare to.

Re:25,000 lines of code

By toxonix • Score: 5, Insightful Thread

It takes a lot of time to create GOOD code with LLMs. The first thing it generates might be good, but not good enough to ship. All the happy-path tests and unnecessary string equals checks (like testing that a hard coded message is the exact string we specified… come on now) aren’t going to tell you about all the edge cases you missed. It can only generate what you tell it to. There will be bugs.

Reviewing code is more effort than writing code

By BytePusher • Score: 5, Interesting Thread
I’ve been writing software for 30 years. Honest code reviews take more mental effort than writing the code yourself, unless the changes are small and clearly and verifiably well tested. Proper design for unit testing is hard and beyond the capabilities of AI. Hence, you can’t really do better than a human software engineer, yet.

Re: 25,000 lines of code

By liqu1d • Score: 5, Funny Thread
That’s why I just run a find and replace removing line breaks. Removes so many lines of code.

Judge Allows BitTorrent Seeding Claims Against Meta, Despite Lawyers ‘Lame Excuses’

Posted by BeauHD
An anonymous reader quotes a report from TorrentFreak:
In an effort to gather material for its LLM training, Meta used BitTorrent to download pirated books from Anna’s Archive and other shadow libraries. According to several authors, Meta facilitated the infringement of others by “seeding” these torrents. This week, the court granted the authors permission to add these claims to their complaint, despite openly scolding their counsel for “lame excuses” and “Meta bashing.” […] The judge acknowledged that the contributory infringement claim could and should have been added back in November 2024, when the authors amended their complaint to include the distribution claim. After all, both claims arise from the same factual allegations about Meta’s torrenting activity.

“The lawyers for the named plaintiffs have no excuse for neglecting to add a contributory infringement claim based on these allegations back in November 2024,” Judge Chhabria wrote. The lawyers of the book authors claimed that the delay was the result of newly produced evidence that had “crystallized” their understanding of Meta’s uploading activity. However, that did not impress the judge. He called it a “lame excuse” and “a bunch of doubletalk,” noting that if the missing discovery truly prevented the contributory claim from being added in November 2024, the same logic would have prevented the distribution claim from being added at that time as well. “Rather than blaming Meta for producing discovery late, the plaintiffs’ lawyers should have been candid with the Court, explaining that they missed an issue in a case of first impression..,” the order reads.

Judge Chhabria went further, noting that the authors’ law firm, Boies Schiller, showed “an ongoing pattern” of distracting from its own mistakes by attacking Meta. He pointed specifically to the dispute over when Meta disclosed its fair use defense to the distribution claim, which we covered here recently, characterizing it as a false distraction. “The lawyers for the plaintiffs seem so intent on bashing Meta that they are unable to exercise proper judgment about how to represent the interests of their clients and the proposed class members,” the order reads. Despite the criticism, Chhabria granted the motion. […] For now, the case moves forward with a fourth amended complaint, three new loan-out companies added as named plaintiffs, and a growing list of BitTorrent-related claims for Judge Chhabria to resolve.

Microsoft Copilot Is Now Injecting Ads Into Pull Requests On GitHub

Posted by BeauHD
Microsoft Copilot is reportedly injecting promotional “tips” into GitHub pull requests, with Neowin claiming more than 1.5 million PRs have been affected by messages advertising integrations like Raycast, Slack, Teams, and various IDEs. From the report:
According to Melbourne-based software developer Zach Manson, a team member used the AI to fix a simple typo in a pull request. Copilot did the job, but it also took the liberty of editing the PR’s description to include this message: “Quickly spin up Copilot coding agent tasks from anywhere on your macOS or Windows machine with Raycast.” A quick search of that phrase on GitHub shows that the same promotional text appears in over 11,000 pull requests across thousands of repositories. Even merge requests on GitLab aren’t safe from the injection.

So what’s happening? Well, Raycast has a Copilot extension that can do things like create pull requests from a natural language command. The ad directly names Raycast, so you might think that Raycast is injecting the promo into the PRs to market its own app. But it is more likely that Microsoft is the one doing the injecting. If you look at the raw markdown of the affected pull requests, there is a hidden HTML comment, “START COPILOT CODING AGENT TIPS”, placed just before the ad tip. This suggests Microsoft is using the comment to insert a “tip” that points back to its own developer ecosystem or partner integrations.
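Because the injected text is delimited by a machine-readable marker, it is easy to strip mechanically. Here is a minimal sketch: the START marker text is quoted in the report, but the matching END marker and the function name are assumptions for illustration, not GitHub’s actual format:

```python
import re

# Hypothetical delimiters: the START marker is quoted in the report;
# the END marker text is an assumption for this sketch.
TIP_BLOCK = re.compile(
    r"<!--\s*START COPILOT CODING AGENT TIPS\s*-->.*?"
    r"<!--\s*END COPILOT CODING AGENT TIPS\s*-->\s*",
    re.DOTALL,  # let .*? span multiple lines of the tip body
)

def strip_copilot_tips(pr_body: str) -> str:
    """Remove any marker-delimited Copilot 'tip' blocks from a PR description."""
    return TIP_BLOCK.sub("", pr_body).rstrip()

body = (
    "Fixes a typo in the README.\n\n"
    "<!-- START COPILOT CODING AGENT TIPS -->\n"
    "Quickly spin up Copilot coding agent tasks with Raycast.\n"
    "<!-- END COPILOT CODING AGENT TIPS -->\n"
)
print(strip_copilot_tips(body))  # -> Fixes a typo in the README.
```

The same one-pass regex works whether the tip sits at the end of the description or in the middle, since the trailing `\s*` swallows leftover blank lines.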
UPDATE: Following backlash from developers, Microsoft has removed Copilot’s ability to insert “tips” into pull requests. Tim Rogers, principal product manager for Copilot at GitHub, said the feature had been intended “to help developers learn new ways to use the agent in their workflow.”

“On reflection,” Rogers said he has since realized that letting Copilot make changes to PRs written by a human without their knowledge “was the wrong judgement call.”

Could be worse

By Locke2005 • Score: 5, Funny Thread
They could have Clippy pop up and say things like, “It looks like you are trying to insert trojans into this encryption code. Would you like help with that?”

Clever approach

By TwistedGreen • Score: 5, Interesting Thread

I really like this approach. The next step would be to put ads directly into the source code. Soon you’ll see comments like

/**
** THIS FUNCTION WAS BROUGHT TO YOU BY EXPRESSVPN
** USE CODE COPILOT26 FOR 90% OFF THE 3 YEAR PLAN
**/

You could quickly monetize your open source projects this way.

Re:Enshitification of Github Proceeds Apace

By Narcocide • Score: 5, Insightful Thread

Leela: “Didn’t you have ads in the 20th century?”
Fry: “Well sure, but not in our dreams! Only on tv and radio…and in magazines…and movies. And at ball games, on buses, and milk cartons, and t-shirts, and bananas, and written on the sky. But not in dreams! No sirree.”

Title Correction:

By Sebby • Score: 5, Insightful Thread

Microsoft Copilot Is Now Injecting Ads Into Pull Requests On GitHub

Microslop Copilot Is Now Injecting Ads Into Pull Requests On GitHub

There FTFY.

Reminder everyone

By thegarbz • Score: 5, Insightful Thread

Microslop does not like you calling them Microslop. It is offensive to call their slop slop and to associate slop with Microslop.

Sony Shuts Down Nearly Its Entire Memory Card Business Due To SSD Shortage

Posted by BeauHD View on SlashDot Skip
For the “foreseeable future,” Sony says it has stopped accepting new orders for most of its CFexpress and SD memory card lines due to an ongoing memory supply shortage. “Due to the global shortage of semiconductors (memory) and other factors, it is anticipated that supply will not be able to meet demand for CFexpress memory cards and SD memory cards for the foreseeable future,” the company said in a notice. “Therefore, we have decided to temporarily suspend the acceptance of orders from our authorized dealers and from customers at the Sony Store from March 27, 2026 onwards.” PetaPixel reports:
The suspension includes all of Sony’s memory card lines, including CFexpress Type A, CFexpress Type B, and SD cards. The 240GB, 480GB, 960GB, and 1920GB capacity Type A cards have been suspended, as have the 480GB and 240GB Type B cards. The full gamut of Sony’s high-end SD cards has also been suspended, including the 256GB, 128GB, and 64GB TOUGH-branded cards and the lower-end 512GB, 256GB, and 128GB plainly-branded Sony cards, which cap out at V60 speeds. Even Sony’s lower-end, V30 128GB and 64GB SD cards have been suspended, showcasing that the shortage affects all types of solid-state storage, not just the high-end ones.

It appears that only the 960GB CFexpress Type B card and the lowest-end SF-UZ series SD cards remain in production. However, those UHS-I SD cards are discontinued in the United States outside of a scant few retailers and resellers. “We sincerely apologize for any inconvenience this may cause our customers,” Sony concludes.

Dear Tech Gods

By Tablizer • Score: 4, Funny Thread

Make the AI bubble pop already. I’ll stop wanking off for a year if you pop it, I swear on my rosie palm.

What good is AI

By ebunga • Score: 5, Insightful Thread

If there are no computers that are able to access it and no new data to feed it?

Sam Altman’s response

By 93 Escort Wagon • Score: 4, Funny Thread

“You don’t really need to take pictures anyway - let our AI create any image you want!”

Tech CEOs Suddenly Love Blaming AI For Mass Job Cuts

Posted by BeauHD View on SlashDot Skip
An anonymous reader quotes a report from the BBC:
Sweeping job cuts at Big Tech companies have become an annual tradition. How executives explain those decisions, however, has changed. Out are buzzwords like efficiency, over-hiring, and too many management layers. Today, all explanations stem from artificial intelligence (AI). In recent weeks, giants including Google, Amazon, Meta, as well as smaller firms such as Pinterest and Atlassian, have all announced or warned of plans to shrink their workforce, pointing to developments in AI that they say are allowing their firms to do more with fewer people. […] But explaining cuts by pointing to advances in AI sounds better than citing cost pressures or a desire to please shareholders, says tech investor Terrence Rohan, who has had a seat on many company boards. “Pointing to AI makes a better blog post,” Rohan says. “Or it at least doesn’t make you seem as much the bad guy who just wants to cut people for cost-effectiveness.”

That does not mean there is no substance behind the words, Rohan added. Some of the companies he’s backing are using code that is 25% to 75% AI-generated. That is a sign of the real threat that AI tools for writing code represent to jobs such as software developer, computer engineer and programmer, posts once considered a near-guarantee of highly paid, stable careers. “Some of it is that the narrative is changing, some of it is that we really are starting to see step changes in productivity,” Anne Hoecker, a partner at Bain who leads the consultancy’s technology practice, says of the recent job cuts. “Leaders more recently are seeing these tools are good enough that you really can do the same amount of work with fundamentally less people.”

There is another way that AI is driving job cuts — and it has nothing to do with the technical abilities of coding tools and chatbots. Amazon, Meta, Google and Microsoft are collectively planning to pour $650 billion into AI in the coming year. As executives hunt for ways to try to ease investor shock at those costs, many are landing on payroll, typically tech firms’ single biggest expense. […] Although the expense of, for example, 30,000 corporate Amazon employees is dwarfed by that company’s AI spending plans, firms of this size will now take any opportunity to cut costs, Rohan says. “They’re playing a game of inches,” Rohan says of cuts at Big Tech firms. “If you can even slightly tune the machine, that is helpful.” Hoecker says cutting jobs also signals to stock market investors worried about the “real and huge” cost of AI development that executives are not blithely writing blank cheques. “It shows some discipline,” says Hoecker. “Maybe laying off people isn’t going to make much of a dent in that bill, but by creating a little bit of cashflow, it helps.”

Not me guv

By GeekWithAKnife • Score: 5, Insightful Thread
Honestly I didn’t go on a hiring jolly and double my workforce in a short period of time without really having work for them.

It’s all because of AI!

Re:BS

By supremebob • Score: 5, Informative Thread

The annoying thing is that the “AI” seems to stand for Actually India, as they’re replacing US developers and testers with foreign contractors when the AI automation doesn’t work out as planned. It’s really the same outsourcing efforts that we saw in the 2000’s and 2010’s with a better cover story.

Insider perspective: AI helps with amnesia only

By Somervillain • Score: 5, Interesting Thread
The CEOs are lying and LLM-based AIs are greatly overrated. They’re helpful, but they’re more like an enhanced version of Stack Overflow. If you know what you’re doing, they slow you down. For example, I know Java really really really well. When I have Claude Sonnet or Opus generate Java, it takes them waaay longer than it takes me to write it, so I can’t be lazy and outsource it to them. It seriously takes the AI minutes to do something it takes me seconds to do.

OK, so what about things that take me minutes?…like writing a unit test?…well, that’s my favorite use case for AI. I’d LOVE to see it succeed, but I work primarily in Java, a compiled language…and it is strict about getting things right, so I see the errors immediately. Python users who vibe code just ship bugs and let their users find them. OK, so with enough tries, it barfs out a unit test. It looks pretty good…after all, LLMs are top-notch guessers. Unfortunately, the unit test is completely wrong and useless…so I have to go make it actually test the code instead of testing bean getters and setters and stupid shit like that. The scary part is that it looks good. It looks correct. But it often isn’t, so you have to evaluate line-by-line.

One of my coworkers is more bullish on AI and introduced over 20 bugs last week with his AI slop, including undoing half my fixes for the week. His boss is considering putting him on a Performance Improvement Plan for his AI use. He’s not a dumb guy. He just didn’t understand the pieces I worked on, didn’t read my docs and comments, and was fooled by the AI when it undid all my code to make his component’s test pass. He is in India and didn’t wait for me to review the code, instead having someone in IST review it who knew even less.

The only powerful use case I’ve found for them is for scenarios where you need to work with a technology you used to know well, but have forgotten. As a backend software engineer, this would be front-end code, RegExes, obscure stored procedure method calls, etc. For RegExes, I write them maybe 2x a year…so I never am confident of things I write. I can review the code better than I can write it from scratch.

If you’ve never used a technology, the code is unreliable. At best, it might save you some time learning. For example, if I had to write something in C#, a language I barely touched 20 years ago, it might double the ramp-up time, but I’d still have to spend a lot of time learning the fundamentals of the language. It would take the place of a really well written book…helpful, but not a game changer.

The point being…AI doesn’t tangibly save time. It might save a bit under some circumstances, but not enough to justify layoffs. The CEOs are full of shit. They’re AI-washing routine layoffs. Either they overhired, or they wanted to shut down products and features, or they wanted to get rid of dead weight…but apparently it’s more fashionable to overtly lie to investors? It baffles me why shareholders haven’t filed a lawsuit against Benioff…or any other CEO.

Re:BS

By ranton • Score: 5, Interesting Thread

The CEOs of these companies are trying to justify inflated stock prices that were high based on the expectation of future growth.

No, CEOs are trying to show their board, investors, and activist investors that they have a plan for how to take advantage of AI and can at least keep up with their competitors’ use of AI, if not surpass them. I work at a large enterprise (close to 50k employees) and VPs are being told that they need to find ways for AI to have an impact on their department or their leaders will find someone who can. If it isn’t happening fast enough, consultants are brought in to take over their department’s transformative roadmap and leaders who can’t keep up are relegated to being SMEs until they are eventually replaced. I’m not in the room when that message is given, but I’ve seen VPs who were raising alarms almost immediately turn into AI cheerleaders.

If you work for a publicly traded or VC backed company I assure you your CEO does not have a choice on whether to jump on the AI bandwagon. That’s not how hype driven bubbles work.

Re:Insider perspective: AI helps with amnesia only

By Locke2005 • Score: 5, Insightful Thread
Remember how soon after they gave you a phone that remembers all your phone numbers for you, you couldn’t remember anybody’s phone number without your phone?

Remember how people used to add up numbers by hand until it became easier to do it with a calculator, and now they can’t do math?

One wonders what other skills will atrophy due to AI reliance.

New Company Hopes to Build Age-Verification Tech into Vape Cartridges

Posted by EditorDavid View on SlashDot Skip
Their goal is to use biometric data and blockchain to build age-verification measures directly into disposable vape cartridges.

Wired reports on a partnership between vape/cartridge manufacturer Ispire Technology and regulatory consulting company Chemular (which specializes in the nicotine market) — which they’ve named “Ike Tech”:
[Using blockchain-based security, the e-cig cartridge] would use a camera to scan some form of ID and then also take a video of the user’s face. Once it verifies your identity and determines you’re old enough to vape, it translates that information into anonymized tokens. That info goes to an identity service like ID.me or Clear. If approved, it bounces back to the app, which then uses a Bluetooth signal to give the vape the OK to turn on.

“Everything is tokenized,” [says Ispire CEO Michael Wang]. “As a result of this process, we don’t communicate consumer personal private information.” He says the process takes about a minute and a half… After that onetime check, the Bluetooth connection on the phone will recognize when the vape cartridge is nearby and keep it unlocked. Move the vape too far away from the phone, and it shuts off again. Based on testing, the companies behind Ike Tech claim this process has a 100 percent success rate in age verification, more or less calling the tech infallible. “The FDA told us it’s the holy grail technology they were looking for,” Wang says. “That’s word-for-word what they said when we met with them....”

Wang says the goal is to implement additional features in the verification process, like geo-fencing, which would force the vape to shut off while near a school or on an airplane. In the future, the plan is to license this biometric verification tech to other e-cig companies. The tech may also grow to include fingerprint readers and expand to other product categories; Wang suggests guns, which have a long history of age-verification features not quite working.
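The described flow — a one-time age check followed by a Bluetooth proximity check that shuts the device off when the phone moves out of range — can be sketched as a small state machine. Everything below (the class, method names, and the range threshold) is a hypothetical illustration of that flow, not Ike Tech’s actual implementation:

```python
class VapeLock:
    """Hypothetical proximity lock: the device works only after a one-time
    age verification and while the paired phone is within Bluetooth range."""

    def __init__(self, max_range_m: float = 10.0):
        self.max_range_m = max_range_m  # assumed cutoff distance
        self.age_verified = False

    def complete_age_check(self, token_approved: bool) -> None:
        # In the described system, an anonymized token comes back from an
        # identity service (e.g. ID.me or Clear) after the ID + face scan.
        self.age_verified = token_approved

    def can_fire(self, phone_distance_m: float) -> bool:
        # Fires only if verified AND the phone is nearby.
        return self.age_verified and phone_distance_m <= self.max_range_m

lock = VapeLock()
assert not lock.can_fire(1.0)    # locked before verification
lock.complete_age_check(True)
assert lock.can_fire(1.0)        # unlocked while near the phone
assert not lock.can_fire(50.0)   # shuts off when the phone moves away
```

The geo-fencing feature Wang mentions would slot into `can_fire` as an additional location predicate alongside the distance check.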

Re:Blockchain???

By Zarhan • Score: 5, Informative Thread

They should add AI and quantum computing into the mix just for maximum buzzword coverage.

Glad I don’t smoke

By Powercntrl • Score: 5, Insightful Thread

I already hate that I need a smartphone app to charge my EV at most DCFC stations (the one saving grace is that I don’t need to fast charge all that often), but having to use an app every time you want to get your nicotine fix would be a real pain in the ass. Something tells me if this actually caught on, vapers would just go back to smoking the old fashioned combustion form of cancer sticks.

Re: Glad I don’t smoke

By ThurstonMoore • Score: 5, Informative Thread

The world has become so horrible in the last 10-15 years.

Re:Blockchain???

By cayenne8 • Score: 5, Funny Thread
Fuck it....

I’m gonna just go back to smoking real cigarettes....

It was MUCH more fun anyway…you got to carry a lighter all the time, play with fire....and flicking ashes at the bar while talking to a girl just felt....right.

Hell, maybe go back a bit further and buy loose tobacco and roll my own.

Pure analog pleasure.....geez I miss it.....

Stop That

By bill_mcgonigle • Score: 5, Insightful Thread

You people have gone insane.

Stop trying to control every atom of existence and every move people make.

You’re sick in the head, not visionaries, not thought leaders.

Go plant a garden and get back in touch with the real world.

No, NOT FARMVILLE!

Apple’s Early Days: Massive Oral History Shares Stories About Young Wozniak and Jobs

Posted by EditorDavid View on SlashDot Skip
Apple’s 50th anniversary is this week — and Fast Company’s Harry McCracken just published an 11,000-word oral history with some fun stories from Apple’s earliest days and the long and winding road to its very first home computers:
Steve Wozniak, cofounder, Apple: I told my dad when I was in high school, “I’m going to own a computer someday.” My dad said, “It costs as much as a house.” And I sat there at the table — I remember right where we were sitting — and I said, “I’ll live in an apartment.” I was going to have a computer if it was ever possible. I didn’t need a house.
Woz even remembers trying to build a home computer early on with a teenaged Steve Jobs and Bill Fernandez from rejected parts procured from local electronics companies. Woz designed it — “not from anybody else’s design or from a manual. And Fernandez was one of those kids that could use a soldering iron.”
Bill Fernandez: The computer was very basic. It was working, and we were starting to talk about how we could hook a teletype up to it. Mrs. Wozniak called a reporter from the San Jose Mercury, and he came over with a photographer. We set up the computer on the floor of Steve Wozniak’s bedroom.

Well, the core integrated circuit that ran the power supply that I built was an old reject part. We turned on the computer, and the power supply smoked and burnt out the circuitry. So we didn’t get our photos in the paper with an article about the boy geniuses.
But within a few years Jobs and Wozniak both wound up with jobs at local tech companies. Atari cofounder Nolan Bushnell remembers that Steve Jobs “wasn’t a good engineer, but he was a great technician. He was pristine in his ability to solder, which was actually important in those days.” Meanwhile Allen Baum had shared Wozniak’s high school interest in computers, and later got Woz a job working at Hewlett-Packard — where employees were allowed to use stockroom parts for private projects. (“When he needed some parts, even if we didn’t have them, I could order them.”) Baum helped with the Apple I and II, and joined Apple a decade later.

Wozniak remembers being inspired to build that first Apple I by the local Homebrew Computing Club, people “talking about great things that would happen to society, that we would be able to communicate like we never did [before] and educate in new ways. And being a geek would be important and have value.” And once he’d built his first computer, “I wanted these people to help create the revolution. And so I passed out my designs with no copyright notices — public domain, open source, everything. A couple of other people in the club did build it.”

But Woz and Jobs had even tried pitching the computer as a Hewlett-Packard product, Woz remembers:
Steve Wozniak: I showed them what it would cost and how it would work and what it could do with my little demos. They had all the engineering people and the marketing people, and they turned me down. That was the first of five turndowns from Hewlett-Packard. Steve Jobs and I had to go into business on our own.
In the end, Randy Wigginton, Apple employee No. 6, remembers witnessing Jobs, Wozniak, and Ronald Wayne sign Apple’s founding contract, “which is pretty funny, because I was 15 at the time.” And it was Allen Baum’s father who gave Wozniak and Jobs the bridge loan to buy the parts they’d need for their first 500 computers.

After all the memories, the article concludes that “Trying to connect every dot between Apple, the tiny, dirt-poor 1970s startup, and Apple, the $3.7 trillion 21st-century global colossus, is impossible.”
But this much is clear: The company has always been at its best when its original quirky humanity and willingness to be an outlier shine through.

Mark Johnson, Apple employee No. 13: I was in Cupertino just yesterday. It’s totally different. They own Cupertino now.

Jonathan Rotenberg, who cofounded the Boston Computer Society in 1977 at age 13: People want to hate Apple, because it is big and powerful. But Apple has an underlying moral purpose that is immensely deep and expansive…

Mike Markkula, the early retiree from Intel whose guidance and money turned the garage startup into a company: The culture mattered. People were there for the right reasons — to build something transformative — not just to make money. That alignment produced extraordinary results…

Steve Wozniak: Everything you do in life should have some element of joy in it. Even your work should have an element of joy… When you’re about to die, you have certain memories. And for me, it’s not going to be Apple going public or Apple being huge and all that. It’s really going to be stories from the period when humble people spotted something that was interesting and followed it.

I’ll be thinking of that when I die, along with a lot of pranks I played. The important things.

Re:Wozniak - the real reason for Apple

By Petersko • Score: 5, Insightful Thread

It’s too convenient to just write off Jobs. The truth is somewhere in the middle, as it always is. The idea that plenty of others could have done what he did is just too dismissive. When he died the company was worth a third of a trillion dollars. Not just any sociopath can pull that off.

Re:Wozniak - the real reason for Apple

By serviscope_minor • Score: 5, Insightful Thread

Jobs gets all the accolades and fame but he was just a pushy sociopath in a suit,

Suit? The guy who famously wore a black turtleneck all the time?

Anyhoo. I think people outside tech overestimate the importance of CEOs and people in tech underestimate it. Without Jobs, Woz probably would have been a really great engineer in some company and you’d never have heard of him at all. He wasn’t a product guy, and you need a product not just raw tech to sell. Selling stuff being somewhat important for a company.

Steve Jobs also had a functioning reality distortion field, something not all that many people have and that’s really important for building a company…

is Apple the only one?

By v1 • Score: 5, Insightful Thread

Of all the early computer start-ups, Apple is the only “started in the garage on a shoestring budget and passion to create something everyone would love” story that I can recall hearing about. Were they the only ones to get started like that?

And I see so many people already trash-talking Jobs… business sense without a great product has nothing to sell, but tech genius without business never takes off. Both are necessary! It takes a good product and a good salesman to make a successful brand. Apple was fortunate to have both; it was their recipe for success.

Re:Wozniak - the real reason for Apple

By tlhIngan • Score: 5, Informative Thread

Jobs gets all the accolades and fame but he was just a pushy sociopath in a suit, plenty of others could have done what he did. VERY few could have done what Wozniak did and its a damn shame that not many people outside of the tech world have heard of him.

That is false. Jobs and Wozniak actually are the yin and yang of Apple. Wozniak by himself, left to his own devices, would still be working at HP. Jobs by himself, would have been a has-been engineer. Jobs was actually competent as an engineer (unlike say, other “engineers” like Musk).

Jobs though, was more tuned into the business side of things than the engineering side of things, while Woz was the opposite.

Woz and Jobs got started by making a blue box - Jobs had read about them in the Esquire article, and Woz built one of the first digital blue boxes. Both of them went around Berkeley selling them to college students for $150 or so and they made a few thousand.

Jobs knew about computers, Woz built a computer. Woz was basically giving the Apple I away at the Homebrew Computer Club and it was one among dozens of others doing the same. Jobs had the business acumen to recognize he could do one better, selling it to a computer store and getting production going (requiring Woz to sell his HP-35 calculator). They’d build 10 (all they could afford), sell them and use the money to build 20 and so on.

Woz designed the hardware. Jobs saw the potential and could leverage the confidence he had to not just sell it, but to get it produced - arranging for the suppliers to give them 30 days credit.

These days it’s a lot easier since if you want something built, China can handle the production if you meet the minimum order quantity. But back then, it’s not like there was a huge electronics supply chain, production lines, or anything else.

Both Jobs and Woz were soldering Apple Is in that garage too - like I said, Jobs was an OK engineer, but he knew talent. Woz was an excellent engineer, but was happy at HP and didn’t really have the desire to start his own company.

It’s Yin and Yang - you need both, which is how Apple got started. Woz would likely have kept this computer as a nifty prototype then bought a Commodore when they came out a few years later whilst still at HP. Jobs would likely have drifted among various electronics companies (he was at Atari) once the crash happened.

You have to remember Jobs went and founded NeXT after the Macintosh era, when the Apple board ousted him. He sold his Apple stock and basically created NeXT. He used the earnings at NeXT to basically backstop Pixar (which was struggling and about to go under) and eventually fund Toy Story.

And he brought the second coming to Apple, recognizing talent in Jony Ive to design a computer so unique everyone knew what it was.

Doesn’t excuse Jobs being an asshole, though. The only redeeming personal quality Jobs had is that he managed his RDF (reality distortion field) to push the people who work for him to do their best work. He was a pain to work for, but if you actually do good, he did reward you to encourage you to do more great work.

Re:Wozniak - the real reason for Apple

By Guspaz • Score: 4, Informative Thread

Clive Sinclair’s company collapsed within five years of shipping their first computer. Perhaps not a good counter-example.

Rivian and Lucid Win Right to Sell Their EVs Directly to Buyers in Washington State

Posted by EditorDavid View on SlashDot
The Wall Street Journal reports that Rivian “just won a yearslong battle with car dealers in Washington state that threatens the model of how cars are sold.”
After fighting to sell its vehicles directly to buyers, Rivian threatened to take its case to voters with a ballot measure to permit direct sales. The dealers blinked. The state’s dealer lobby not only dropped its opposition to a sales loophole for Rivian and rival EV-maker Lucid, but also encouraged lawmakers to approve one. The measure became law this month…

New auto entrants like Rivian, and Tesla before it, have spent years contending with long-established U.S. state laws that require new cars to be sold through independent franchised dealers. The auto startups — typically makers of EVs — argue that they can offer a better experience by selling directly to consumers, much as Apple sells iPhones through its own stores and online. Rivian CEO RJ Scaringe has said the company is committed to direct-only sales because it’s more profitable and gives the company control over how its vehicles are sold, marketed and maintained. The Washington compromise riled traditional automakers, including General Motors, Ford and Toyota, which lobbied against it, arguing it unfairly advantages startups. A trade group representing the automakers called it discriminatory and argued the exception could one day open the door to Chinese EV makers…

German automaker Volkswagen is currently facing several lawsuits from dealers over its plan to sell new Scout vehicles directly to consumers. Dealers say independent franchises are vital to the car-buying process, creating competition between dealerships that keeps prices affordable for consumers, while providing valuable services such as repairs, warranty work and financing… Yet for Washington’s dealers, the prospect of putting franchise laws up for a popular vote laid bare a tough reality: given the choice, many car buyers want the freedom to avoid dealerships. Rivian’s polling, which the company shared with lawmakers, showed nearly 70% of respondents favored allowing direct sales when asked whether they would support manufacturers selling cars directly to consumers…

The fight comes at a critical time for Rivian, which is launching a new, more affordable SUV in a bid to make consistent profits amid a downturn in U.S. EV sales… Rivian is able to directly sell cars in roughly half of U.S. states, but a number of them limit how many locations the company can operate. They can’t disclose the price, though. For that, customers must go online.
The article notes that “Following the win, Rivian executives are eyeing other states that, like Washington, ban direct sales but also allow ballot initiatives: Arkansas, Ohio, Oklahoma, Montana, Nebraska and South Dakota…” It adds that lawmakers (from both parties) in the state of Washington had said “they have long felt pulled between giving consumers more car-buying freedom and protecting dealers, essentially small-business owners who are vital to local economies — and politically powerful.”

But an executive at the Washington State Auto Dealers Association said dealers supported this new law partly because it protects them by barring future automakers from selling directly in the state, and by requiring Rivian and Lucid to adhere to the same regulations that govern how dealers operate.

Re:Was not expecting them to admit that

By swillden • Score: 5, Informative Thread

arguing it unfairly advantages startups

Way to say your dealers suck.

They had to say it that way, because the more accurate statement is that the dealership law unfairly advantages existing automakers. It’s not about the dealerships being good or bad, it’s about the fact that setting up a dealership network takes a lot of time and money and requiring it is a good way to keep new competition out.

Another example of US archaism

By shilly • Score: 5, Interesting Thread

The US has fallen behind the rest of the developed world in so many aspects of life due to ossification of structures driven by regulatory capture and fragmentation. Dealerships have been nothing but pernicious for consumers for decades, keeping ICE sales higher than they’d otherwise be, keeping prices higher than they’d otherwise be, etc etc. The rest of the world looks on with incredulity that you find it so difficult to unfuck yourselves.

Re:I live in Washington state

By misnohmer • Score: 5, Insightful Thread
I live in Washington too. I’ve owned many cars, including multiple Teslas between 2013 and 2024. I will tell you that there is no clear winner between direct sales and the dealer system. At first (think 2013-2018) Tesla service was amazing, they would bend over backwards to help the early adopters, then the Model 3, and then Model Y came, profitability became the top priority and they wouldn’t even cover a yellowing screen under warranty in a less than one year old six digit priced Model S. I saw people coming with videos of their Model 3’s malfunctioning, and Tesla service saying “we cannot reproduce it, therefore the problem fixed itself”. That is also when I realized that manufacturer-owned service means there is no competition, so they can charge whatever they want - for example an $8 chip that failed in one of my Teslas, causing the main screen not to boot, cost $3,000 to fix (eventually there was a NHTSA-forced recall, but not when I needed the fix; luckily I am capable of replacing a BGA eMMC part myself, but that is not within an average owner’s capabilities). I now own dealer-sold cars, and have to tell you I am getting great service from the dealer (4 cars, 3 different manufacturers), despite the manufacturers’ inadequacies with modern technologies (yea, they suck at software). In the past I’ve owned many dealer-sold brands, and my service experience varied. I’ve had some great experiences, and some horrible ones too. I remember long ago having an issue with a 2 year old Honda for which the dealer wanted a bunch of money to resurface the rotors, eventually getting new rotors replaced under Honda warranty, the service guy literally winking at me saying “hey, I gotta try to make money, eh?” (yep, he was Canadian ;-) ). On the other end of the spectrum I had a Lexus once where the dealer was willing to go to the mat for me to lemon my car over a bluetooth hands-free issue that Lexus (Toyota) was unable to resolve.
Lexus actually sent an engineer from Japan to repair the problem (turned out to be a bent pin in one of the harnesses) to avoid the car getting returned as a lemon.

Bottom line is that direct sales vs. dealer experience does not have a clear cut winner for consumers. I do however strongly believe that both should be allowed by law. This explicit allowing one manufacturer at a time political bribing shit in WA state is just government corruption (happens in other areas too, check out automatic card shufflers allowed in WA as an example, or charter school licenses). I say let people choose, do they want to buy from a dealer or direct from manufacturer, let the market decide what the people want. Heck, allow the same manufacturer to sell via dealer and direct, see what people choose. I bet such a choice would spur competition for dealers to show people why it’s worth buying from them. And yes, I get that it would cost some dealers profits, in situations where they require people to buy a bunch of their cars to qualify for an allocation for a special car (e.g. Porsche 911 GT3 RS), or charge Additional Dealer Markup on top of MSRP (e.g. Corvette).

Sigh

By ledow • Score: 5, Funny Thread

Only in America could you legally argue that an unnecessary profit-making middle-man was legally required and that it would somehow “reduce costs”.

Re:I live in Washington state

By drinkypoo • Score: 5, Insightful Thread

I say let people choose, do they want to buy from a dealer or direct from manufacturer,

I say do not let manufacturers choose, mandate that they release all documentation and software they create internally for service purposes. That’s what’s going to free consumers from tyranny of manufacturers and stealerships.