the unofficial Slashdot digest for 2019-Aug-11

Guido van Rossum Looks at Python's Past, Present, and Future

Posted by EditorDavid
This week 63-year-old Python creator Guido van Rossum shared some interesting stories with ZDNet's senior reporter Nick Heath: While sharing software with the world today only takes a few clicks, in the 1980s it was an altogether more laborious affair, with van Rossum recalling the difficulties of trying to distribute Python precursor ABC. "I remember around '85, going on a vacation trip to the US, my first ever visit to the US, with a magnetic tape in my luggage," says van Rossum. Armed with addresses and phone numbers of people who had signalled an interest in ABC via the rudimentary email system available at the time -- which wasn't suited to handling anything as large as source code -- he travelled door-to-door posting the tapes. Despite this effort, ABC didn't really take off. "So, no wonder we didn't get very far with the distribution of ABC, despite all its wonderful properties," he says.

But as the internet revolution gathered steam, it would become much easier to distribute Python without a suitcase full of tapes. Van Rossum released Python to the world via the alt.sources newsgroup in 1991, under what was pretty much an open-source licence, six years before the term was first coined. While the Python interpreter still had to be joined together from 21 separate parts into a compressed file and downloaded overnight on the Usenet network, it was still a vastly more efficient delivery mechanism than the hand deliveries of a few years earlier.

Guido also shared some new comments on why he stepped down as Python's Benevolent Dictator for Life: "I was very disappointed in how the people who disagreed technically went to social media and started ranting that the decision process was broken, or that I was making a grave mistake. I felt attacked behind my back," he says. "In the past, it had always been clear that if there were a decision to be made about a change in the language or an improved feature, a whole bunch of core developers would discuss the pros and cons of the thing. Either a clear consensus would appear or, if it was not so clear, I would mull it over in my head and decide one way or another. With PEP 572, even though it was clearly controversial, I chose 'Yes, I want to do this', and people didn't agree to disagree.

"It wasn't exactly a revolt, but I felt that I didn't have the trust of enough of the core developer community to keep going."
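For context, the PEP 572 at the center of the dispute added "assignment expressions" (the := "walrus" operator) in Python 3.8. A minimal sketch of the feature, with illustrative variable names:

```python
# PEP 572 assignment expressions: bind a value inside an expression
# instead of needing a separate statement first.
data = [4, 17, 2, 25, 9]

# Before PEP 572: compute, then test, as two steps.
n = len(data)
if n > 3:
    before = f"{n} items"

# With PEP 572: bind and test in one expression.
if (n := len(data)) > 3:
    after = f"{n} items"

print(before == after)  # True: both paths produce "5 items"
```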

He thinks the change in how disputes about the language play out is partly a result of how many people use Python today. "It's probably also the fact that the Python community is so much larger. It's harder to reach any form of consensus, of course, because there's always fringe dissidents, no matter which way you decide." Earlier this year, Python core developers -- those who work on maintaining and updating Python's reference CPython interpreter -- elected a steering council to oversee the future of the language. Van Rossum was elected, alongside Barry Warsaw and fellow core developers Brett Cannon, Carol Willing, and Nick Coghlan.

Re:Actually pretty crappy

By weilawei • Score: 5, Insightful • Thread

If you're using Python for the performance critical core of your software (assuming you even have such an issue, rather than being user-bound), you may want to re-examine that. If you're writing your interface in C, you might also want to question that.

Not all tools are hammers, and not all problems are nails.


By DrYak • Score: 5, Funny • Thread

Wait, are you using the "smartquote on smartphone breaks /.'s UTF" excuse to try to sprinkle syntactically correct Perl in the middle of discussion about Python? That would be sneaky...

Re:Actually pretty crappy

By Ambassador Kosh • Score: 5, Interesting • Thread

I use Python in HPC software along with a C++ simulator. The simulator uses up 99% of the time and the Python parts take up 1%, but there is no way I would ever want to write the Python part in C++. That would be 100K LOC or so, provide no real speedup, and it would be a huge pain in the ass to write.

Python is great for command and control, which is why it is so common in HPC. That is why you see Python in things like AI and not things like Java. If you want to use TensorFlow, then use Python.

Re:Actually pretty crappy

By Ambassador Kosh • Score: 5, Interesting • Thread

Python is the de facto standard for most types of machine learning research, from neural networks with things like TensorFlow to gradient boosting and Gaussian process models. It is very easy to set them up in Python and use them for what you need, paying close to 0% performance cost by handing off the heavy lifting to libraries written in C/C++. If you wanted to do the same thing in Java it would not run any faster, but it would be more annoying to interface with the C/C++ libraries. Even stuff like BLAS and LAPACK is more of a pain in the neck from Java than from Python.
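The low-friction C interop the commenter credits can be sketched with Python's standard ctypes module. This is an illustration, assuming a Unix-like system with a shared C math library available under the usual name:

```python
import ctypes
import ctypes.util
import math

# Load the C math library and call its sqrt directly: no wrapper code,
# no build step. The "libm.so.6" fallback is an assumption for Linux
# systems where find_library needs a toolchain to resolve the name.
libm_path = ctypes.util.find_library("m") or "libm.so.6"
libm = ctypes.CDLL(libm_path)
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

# Both sides compute a correctly rounded IEEE double sqrt.
print(libm.sqrt(2.0) == math.sqrt(2.0))  # True
```

Real numeric libraries (NumPy, TensorFlow) do this wholesale: the Python layer is command and control, and the arithmetic runs in compiled code.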

My personal pet hate with Python...

By OneSmartFellow • Score: 4, Insightful • Thread
... is incompatibility (real or misunderstood) between what appear to be minor revisions.

Example: Python 3.6 to 3.7
The upgrade broke all kinds of stuff, which subsequently forced me to roll back, etc...

This kind of thing is simply unacceptable.

Even Python 2 -> 3 should not have broken anything, but for a point revision to break existing code indicates... basically, that Python sucks.

Does Quantum Cryptography Need a Reboot?

Posted by EditorDavid
"Despite decades of research, there's no viable roadmap for how to scale quantum cryptography to secure real-world data and communications for the masses," according to IEEE Spectrum.

Wave723 shares their report: A handful of companies now operate or pay for access to networks secured using quantum cryptography in the United States, China, Austria, and Japan. According to a recent industry report, six startups plus Toshiba are leading efforts to provide quantum cryptography to governments, large companies (including banks and financial institutions), and small to medium enterprises. But these early customers may never provide enough demand for these services to scale...

From a practical standpoint, then, it doesn't appear that quantum cryptography will be anything more than a physically elaborate and costly -- and, for many applications, largely ignorable -- method of securely delivering cryptographic keys anytime soon. This is in part because traditional cryptography, relying as it does on existing computer networks and hardware, costs very little to implement, whereas quantum crypto requires an entirely new infrastructure of delicate single-photon detectors and sources, and dedicated fiber optic lines. So its high price tag must be offset by a proven security benefit it could somehow deliver -- a benefit that has remained theoretical at best.

Though it was supposed to replace mathematical cryptography, "Math may get the last laugh," the article explains. "An emerging subfield of mathematics with the somewhat misleading name 'post-quantum cryptography' now appears better situated to deliver robust and broadly scalable cryptosystems that could withstand attacks from quantum computers." They quote the security engineer at a New York cybersecurity firm who says quantum cryptography "seems like a solution to a problem that we don't really have."

The article ends by suggesting that research may ultimately be applicable to quantum computers -- which could then be used to defeat math-based cryptography. But riffing on the article's title, sjames (Slashdot reader #1,099) quips that instead of giving quantum cryptography a reboot, maybe it just needs the boot.

Re:Author is confusing 3 concepts

By swillden • Score: 5, Informative • Thread
The confusion is in the claim that the name "post-quantum cryptography" is misleading. The author assumed the "quantum" in that phrase came from "quantum cryptography", when in fact it comes from "quantum computing". That confusion does make the name seem odd and misleading, but if you understand the referent, it makes perfect sense.

RSA and a Quantum computer

By FeelGood314 • Score: 5, Interesting • Thread
A quantum computer can find the period of a function. Consider RSA with n = 15. The numbers co-prime to 15, (1, 2, 4, 7, 8, 11, 13, 14), form a multiplicative group modulo 15 (if you multiply any two of them together and take the remainder when dividing by 15, you get another member of the group). Also, if I multiply any number in that group by itself a certain number of times, I get that number back again. The number of times is the number of elements in the group plus 1. In this case my magic number is 9, because there are 8 elements in the group. So any number raised to the power of 3, and then raised again to the power of 3, gives the original number back. Here my secret key would be 3 and my public key would be 3 also (OK, it isn't exciting with small numbers). So if you wanted to send me 7, you send 7^3 (mod 15) = 13. I then calculate 13^3 (mod 15) = 7. Now RSA is secure because I can give you a really big n = p*q, and the number of elements in our multiplicative group is actually (p-1)*(q-1). Except if you don't know p and q, then finding p and q, and hence (p-1)*(q-1), is really, really hard. A quantum computer, though, can find the period of a function like f(x) = 2^x (mod n); that period divides (p-1)*(q-1), which is enough to recover the factors.
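The toy calculation above can be checked directly in Python. The period-finding step is done here by classical brute force, standing in for the part Shor's algorithm does efficiently on a quantum computer:

```python
# Toy RSA: n = 15, p = 3, q = 5, so phi = (p-1)*(q-1) = 8.
# Both exponents are 3, since 3*3 = 9 = 8 + 1, and x^9 = x (mod 15)
# for every x co-prime to 15.
n, e, d = 15, 3, 3

ciphertext = pow(7, e, n)          # 7^3 mod 15
plaintext = pow(ciphertext, d, n)  # undo it with the private exponent
print(ciphertext, plaintext)       # 13 7

# The quantum attack target: the period of f(x) = a^x mod n, found
# here by brute-force repeated multiplication.
def period(a, n):
    x, k = a % n, 1
    while x != 1:
        x = (x * a) % n
        k += 1
    return k

print(period(2, 15))  # 4, which divides phi = 8
```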

Comment removed

By account_deleted • Score: 3 • Thread
If you look at the Hype cycle for 'AI' there are now signs that it is finally starting to come down from the "peak of inflated expectations". Which is a good thing. Maybe we will be able to talk about software again without people shouting "use AI" all the time. QC on the other hand has just started climbing the peak curve. So just like with 'AI', get ready for 'QC' being applied fast and loose for things not even remotely relevant to the subject just to generate PR.

Re:Price too high?

By gweihir • Score: 5, Insightful • Thread

Security is risk management. It is about identifying, quantifying and then managing the risk. Managing a risk usually means reducing it, accepting it (possibly after reduction) or transferring it (e.g. by using insurance, also possibly after reduction, the risk is then accepted by "somebody else"). The criteria for accepting a risk are always a cost-benefit analysis in competent risk management.

The natural ceiling to the security cost equals the value of the target being attacked.

Not really. The natural ceiling is the most damage that can happen, which may not be limited to the target. You very rarely go that high, though.

Does quantum crypto need a reboot?

By hey! • Score: 3 • Thread

Yes and no.

New Electric Motor Design Massively Boosts Power, Torque, and Efficiency

Posted by EditorDavid
A Texas-based, father/son team raised $4.5 million in seed funding to build "a remarkable electric motor technology," reports New Atlas.

Long-time Slashdot reader Namarrgon writes: Linear Labs' impressive new circumferential flux motor design (video) uses four rotors [where other motors typically run one or two] and a software-reconfigurable, multi-coil stator, enclosed in a 3D magnetic "torque tunnel" to maximize efficiency even at high speeds. The stator can be configured on the fly by regrouping coils to use a variable number of overlapping phases simultaneously, producing full torque smoothly at low rpms without torque pulsing, or changing speeds with no change to frequency, current, or voltage, like an electronic transmission. An innovative approach to field weakening by gradually misaligning permanent magnets allows efficiencies to actually climb as speeds increase.

These features produce a highly compact motor with two to five times the torque density, at least three times the power density and at least twice the total output of any conventional permanent magnet motor of the same size. This also eliminates the need for gearing in many applications, reducing costs and weight while gaining 10-20% more range from a given battery pack.

Linear Labs has received 21 patents so far, with another 29 pending, and their prototypes have been verified by independent expert tests. Recently they received $4.5 million in seed funding, and are planning to build them into car and scooter prototypes over the next couple of years.

Re:That reads like a press release..

By dotancohen • Score: 4 • Thread

You will see far more torn CV joint boots on Hyundai i30s than you will on Subaru Outbacks, though. The Subaru transmission sits higher off the ground, and has a bit of frame below it. The Hyundais have the inner CV joints almost at the level of the oil drain plug.

So there are design factors that dictate how long the components will last.

flux motor?

By sad_ • Score: 3 • Thread

"Linear Labs' impressive new circumferential flux motor design"

did they also make a flux capacitor to get along with it?


By Apostalypse • Score: 4, Interesting • Thread
Does this design work in reverse as a generator? If so, a gearboxless generator would be bigger news than a motor. The gearbox is a major source of failures in wind turbines, and it would dramatically decrease the complexity of all types of mechanical generators.

Re: That reads like a press release..

By careysub • Score: 5, Interesting • Thread

When efficiencies improve it always means we use more of that resource (electricity in this case). See Jevons Paradox.

A random factoid is a dangerous thing. The claim that efficiency improvements by themselves lead to greater usage (so obviously we save resources by being inefficient!) is not only not "always true", it has been very difficult to find any clear case of it occurring at all. Jevons, an economist in the mid-19th century, wrote a tome about this that was not based on actual data showing it to be the case, but was built entirely on speculation.

What he was observing was the rapid rise in the consumption of coal at that time (you know, the Industrial Revolution, with the steam engine) and concurrent improvements in steam engine efficiency (and other uses of coal, like more efficient use of it for heating buildings). But were these increases in efficiency the cause of rising coal usage, or was something else (like the many, many uses to which steam engines were being put by the revolution) driving this demand independent of how efficient they were? The latter would seem to be the case, and thus efficiency improvements were definitely saving coal.

Have the recent advances in lighting efficiency led to increases in per capita electricity demand over the last decade? No, per capita electricity demand has dropped. Didn't Jevons "prove" this can't happen? No, he did nothing of the kind.


By RedShoeRider • Score: 4, Interesting • Thread
Interestingly, New Jersey (USA) just banned almost exactly what you're describing, literally 3 days ago:

"Five other states have outlawed pet leasing: California, Indiana, Nevada, New York, and Washington. Connecticut lawmakers are considering a ban."

Soapbox: Growing old with your dogs is a privilege. /Soapbox

Amazon Ring Alert Leads To Capture of 'Extremely Dangerous' Escaped Convict

Posted by EditorDavid
ABC News describes how Amazon's surveillance doorbell cameras today led to the capture of an "extremely dangerous" inmate: Homicide suspect Curtis Watson, 44, escaped from work detail on a tractor at the West Tennessee State Penitentiary in Henning, Tennessee, about 50 miles northeast of Memphis, on Wednesday. The tractor was later found about a mile away from the prison. Around 3:30 a.m. Sunday morning, police received a tip from Henning residents Harvey and Anne Taylor that they believed they had video surveillance of Watson outside their home, Tennessee Bureau of Investigation Director David Rausch told reporters in a news conference.

The couple was woken up by an alarm from their Ring video doorbell system that alerted them that someone was in their backyard, Harvey Taylor said. When they pulled up the screen, they saw a man looking in the refrigerator in their carport, but couldn't see his face. Once Watson closed the refrigerator door, Anne Taylor recognized Watson from his beard, and the couple called 911...

Within 30 minutes of receiving the Taylors' call, law enforcement officers from multiple agencies descended on the area, "which then kept it contained and controlled from that point forward," Rausch said.

Re:Mathematics, statistics, and downright fibs

By phantomfive • Score: 5, Insightful • Thread
The problem here is that people like their Ring doorbells. They like them enough to pay actual money for them (Facebook would die if it started charging for its whatever. No one likes Facebook). People want that surveillance functionality, and they like the spying.

Because those people know what they want, the rest of us can't act like foaming-mouth lunatics. We have to define exactly what we don't want the Ring to do, and balance it with what the Ring owners want.

If we act with hysteria and outrage without a plan, we will be brushed aside by history as the surveillance state is established.

A statistic that will never be reported

By Required Snark • Score: 3, Interesting • Thread
The number of false alarms resulting from Ring equipped locations.

Also, the racial data on false alarms will not be collected. Amazon and the police want 100 percent deniability about the racial profiling that underlies the marketing strategy and police record-keeping. All the released information will reinforce the myth that policing is fair and "undesirable" kinds of people make most of the trouble. That is already the case with so-called "predictive" policing software, and Ring will just bring this to a mass-market platform.

Re:Fridges and carports.

By rmdingler • Score: 5, Funny • Thread

Really? You never wanted a beer (or other cold beverage) during / after working outside, and didn't feel like taking off your dirty shoes first?

Damn. You're married AF.

Re:A statistic that will never be reported

By thesupraman • Score: 5, Insightful • Thread

WTF are you blathering about?

What does race have to do with the privacy concerns of Ring? You think its cameras only pick up dark skin, perhaps?

As to YOUR attempt to denounce a 'myth' that undesirable people make most of the trouble... perhaps you should consider your use of words there; I'm pretty sure people who cause trouble would, almost by definition, be undesirable.
And yes, I know you were trying to play semantic games to imply things... but don't be so childish. If you want to say 'Police unfairly target black people' then just say that; stop trying to hide behind words.

Ring doesn't actually give one shit about race; it is a surveillance platform, and EVERYONE gets surveilled. THAT'S what you should be caring about, not trying to make this some kind of racial BS.

Re: Mathematics, statistics, and downright fibs

By argStyopa • Score: 5, Insightful • Thread

1) the man was serving 15 years (!) on an aggravated kidnapping rap. That's murder-levels of time, btw.

2) he's likely murdered the corrections woman he was working with

I think we can fairly safely say he was fucking "dangerous"?

Further, I'm getting pretty tired of people misquoting the "trading liberty for security" thing. It's ESSENTIAL liberty, because anyone with a brain understands that society is founded on the constant trading of small liberties for security. We don't drive wherever we want; we stay between the lines and obey the speed limit so the roads are generally safer. We accept that laws constrain our actions, recognizing that those laws also generally protect us.

In turn, some lonely basement dwellers without family, without ties, without property, cry about how they've lost some ephemeral theorized "freedom" (hand waving when asked to explain precisely how).

Thanks, but the bulk of society doesn't care whether there's a camera watching them because there's nothing to see. "But what about....?" - I'll worry about that when it happens. Until then I'll be happy knowing that my family is one tiny bit safer, thanks.

'Who Owns Your Wireless Service? Crooks Do'

Posted by EditorDavid
Long-time Slashdot reader trolman scared this scathing editorial by security researcher Brian Krebs: If you are somehow under the impression that you -- the customer -- are in control over the security, privacy and integrity of your mobile phone service, think again. And you'd be forgiven if you assumed the major wireless carriers or federal regulators had their hands firmly on the wheel. No, a series of recent court cases and unfortunate developments highlight the sad reality that the wireless industry today has all but ceded control over this vital national resource to cybercriminals, scammers, corrupt employees and plain old corporate greed...

Incessantly annoying and fraudulent robocalls. Corrupt wireless company employees taking hundreds of thousands of dollars in bribes to unlock and hijack mobile phone service. Wireless providers selling real-time customer location data, despite repeated promises to the contrary. A noticeable uptick in SIM-swapping attacks that lead to multi-million dollar cyberheists...

Is there any hope that lawmakers or regulators will do anything about these persistent problems? Gigi Sohn, a distinguished fellow at the Georgetown Institute for Technology Law and Policy, said the answer -- at least in this administration -- is probably a big "no."

"The takeaway here is the complete and total abdication of any oversight of the mobile wireless industry," Sohn told KrebsOnSecurity. "Our enforcement agencies aren't doing anything on these topics right now, and we have a complete and total breakdown of oversight of these incredibly powerful and important companies."

He scared it?

By PCM2 • Score: 4, Funny • Thread

How does one scare an editorial? Tell it print is dead?

No problems here

By quonset • Score: 3 • Thread

I open my flip phone, dial the number, and make my call. When I'm done, I close the lid and the call ends. As for those robocalls, if you're not on my list, I don't answer.

I must be doing something wrong because I don't have any of these mentioned issues.


By Kohath • Score: 3 • Thread

Four banks were robbed last year, so bank robbers control the banking industry.

It's a small world after all ...

By CaptainDork • Score: 3 • Thread

... what with monopolist corporations. There are three (3) influential parties in the United States:


And the vast majority of the first two have merged their ideologies into the third.

The current administration has been obsessed, since November 8, 2016, with November 3, 2020.

Take back control...

By aaarrrgggh • Score: 3 • Thread

Treat your phone as a dumb service and demote your carrier down the value chain. Use a persistent VPN and a SIP provider for telephone service. If you want to be ultra-paranoid, just use a portable hotspot from the carrier and a wifi-only tablet as your “phone.” And of course, stop using Facebook, Google, or Amazon products; exclude Apple if you want to continue the ultra-paranoid approach.

It is a significant regression, but that is the downside of everything being integrated and “easy.”

My first step has been switching my primary phone number to SIP with an area code that has a low Chinese population to help stop that particular spam, and in a lower income neighborhood to try to address another. My family all communicates with Signal... which isn’t perfect but does check a number of the boxes.

Vintage 30-Year-Old Mac Resurrected As a Web Server

Posted by EditorDavid
Long-time Slashdot reader Huxley_Dunsany writes: After much work rebuilding and upgrading it, my Macintosh SE/30 from 1989 is now connected via Ethernet to the Web, and is hosting a simple website and old-style "guestbook." The site has been online for a few days (other than semi-frequent reboots of the system when it gets overloaded with requests), and has served nearly 20,000 visitors. For a machine with a 16MHz CPU and 68 megabytes of RAM, it's held up remarkably well!

I'm basically inviting a "Slashdotting" of my old Mac, but I thought this project might bring a few smiles here. Enjoy!

"Awesome," wrote one visitor in the guestbook, adding "You should join a webring!"
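For scale, the whole job the SE/30 performs (accept a connection, return a static page) can be sketched with Python's standard library. This is purely illustrative modern tooling, not what the Mac itself runs:

```python
import threading
import urllib.request
from http.server import HTTPServer, BaseHTTPRequestHandler

PAGE = b"<html><body><h1>Hello from a tiny web server</h1></body></html>"

class StaticHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve the same static page for every path, like a one-page site.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(PAGE)))
        self.end_headers()
        self.wfile.write(PAGE)

    def log_message(self, *args):
        pass  # keep the demo quiet

# Bind to an ephemeral port, serve exactly one request, and fetch it back.
server = HTTPServer(("127.0.0.1", 0), StaticHandler)
thread = threading.Thread(target=server.handle_request)
thread.start()
url = f"http://127.0.0.1:{server.server_address[1]}/"
body = urllib.request.urlopen(url).read()
thread.join()
print(body == PAGE)  # True
```

The hard part of the project isn't the protocol, which is this simple at its core; it's doing it with 1989 hardware and period software.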

Sorry guys!

By Huxley_Dunsany • Score: 5, Informative • Thread
Oh, of *course* the first time I ever make the front page of Slashdot (after 20 years of being a reader!), I've just landed on an out-of-state work trip, and thus cannot reboot my old Mac and bring it back online. Really sorry everyone! I swear the site was up and responding well a few days ago when I submitted this, and will be again when I get home again later this week.

68 MB

By JoeCommodore • Score: 3 • Thread

68 MB? Wow, that's a lot of RAM. (for a Mac SE/30 - really!)

Re:You've been Slashdotted!

By jwhyche • Score: 4 • Thread

I'm flipping through the news looking for a story where an old Mac exploded after being used as a webserver.


By Voyager529 • Score: 5, Insightful • Thread

Question 1: Why?

To quote John F. Kennedy:
"We choose to go to the moon in this decade and do the other things, not because they are easy, but because they are hard, because that goal will serve to organize and measure the best of our energies and skills, because that challenge is one that we are willing to accept..."

It's 2019. For $50, I can buy a RasPi starter kit, spend an hour with my mom, walk her through flashing the card with DietPi, and she'll have a web server a hundred times more powerful than this one.
I can grab an Optiplex 3010 for $100, hand it to her with a copy of the Turnkey Linux LAMP appliance, and she'll have a working web server almost by accident.

The point isn't to be serving a web page. That's just the finish line.

This project appeals to the person who took it on because it's hard. A Mac from 1989 predates basically every readily accessible piece of handholding.
There sure as hell isn't any help from Apple about it. I'd love to watch a video of him taking this unit to an Apple Genius; it's quite possible the machine is older than them.
There's nothing on StackOverflow about it.
There are no Youtube tutorials for it.
There's nothing in most user forums (unless maybe there's a vintage computing forum).
There might be some Usenet posts about it, but that sort of an archaeological dig is a task unto itself. Manuals and textbooks are very unlikely to cover doing web serving.
Apache, nginx, and other common web servers are unlikely to compile for the CPU architecture by default.

This computer is undoubtedly less powerful than the router it's sitting behind. The fact that the submitter was able to get it to work is basically an indicator that whatever procedures they used to get this machine running are essentially unique.

Question 2: Why is this a story?

...Because it's fairly unique. The number of stories about these sorts of projects (and being successful in a manner that everyone reading can see) is something I'd like to see increase. We've had 1,001 stories about politics, Ajit Pai, and Donald Trump over the past 2-3 years. I can find that sort of thing anywhere. This sort of thing? Where would I find that? I'm happy to have read such a story and have visited those pages.

Also, Slashdot is a community. I *might* agree that perhaps a more formalized standard about personal projects being submitted should exist. However, if a community member embarks on a quest to do something like this, and is successful? Hell yes!

My favorite computer

By realkiwi • Score: 3 • Thread

The SE/30 is on the top of the list of my favorite computers.

Landmark 2.80 Release of Open Source Blender 3D With Improved UI Now Available

Posted by EditorDavid
"In the 3D content creation space, where a lot of professional 3D software costs anywhere from $2K to $8K a license, people have always hoped that the free, open source 3D software Blender would some day be up to the job of replacing expensive commercial 3D software packages," writes Slashdot reader dryriver: This never happened, not because Blender didn't have good 3D features technically, but rather because the Blender Foundation simply did not listen to thousands of 3D artists screaming for a "more standard UI design" in Blender. Blender's eccentric GUI, with reversed left-click/right-click conventions, keyboard shortcuts that don't match commercial software, and other nastiness, just didn't work for a lot of people.

After years of screaming, Blender finally got a much better and more familiar UI design in release 2.80, which can be downloaded here. Version 2.80 has many powerful features, but the standout feature is that after nearly 10 years of asking, 3D artists finally get a better, more standard, more sensible User Interface. This effectively means that for the first time, Blender can compete directly with expensive commercial 3D software made by industry leaders like Autodesk, Maxon, NewTek and SideFX.

Why the Blender Foundation took nearly a decade to revise the software's UI is anybody's guess.

Over 20 years later, I am still right.

By BigBlockMopar • Score: 4, Insightful • Thread

Why the Blender Foundation took nearly a decade to revise the software's UI is anybody's guess.

Fair enough.

The vast majority of artists I've worked with will struggle and complain rather than learning a single fucking thing about how to improve their situation. Also, most of them could never come up with the idea "Hey, we can just pay someone to fix it," as they're usually broke.

You know, the funny thing about people who BUY software is that they usually get heard by the developers. "Sure, we'll tack a McMansion onto this little outhouse called DOS, we know the foundation is shaky and the basement stinks really bad, but we'll call it Windows 10! That'll be $199 for your copy!"

About 20 years ago, I wrote a rant about how Linux isn't ready for the desktop. It still isn't.

The issue is with Open Source development. Somehow, the development model needs to combine the best of unified design from having (ugh...) a CEO with the best of the Open Source model of "I love to optimize the algorithms that render JPEGs, so that's where I VOLUNTEER".

Except for the containment of the Open Source community that Google provides in Android, Linux still isn't ready for general use outside of PVRs and elevator controllers.

It hurts me, as someone who has been a Linux fan and running it for well over 20 years now, that the general desktop user experience remains so fragmented and fraught with frustration. Do I have viruses and ransomware? When I accidentally launch Wine, maybe. Do I have licenses to deal with? No, because AutoCAD doesn't release a Linux version because it can't figure out whether to support Red Hat or SuSE or Debian or Gentoo or Ubuntu, and they know everyone will squawk if they still believe in a proprietary software business model. What I have is 1,500 different distributions out there, each with at least 6 live ISOs to download. The choices alone are death by a million papercuts. And I've been running Linux since 1996!

What would a newbie feel, who just needs to browse the web? At first it was setting the IRQs on network adapters... which you usually had to do blind, because until you got it right, your one and only computer was down. When Knoppix came along with its then-revolutionary hardware detection, you couldn't use it at work for troubleshooting: all the way down to tacky graphics and sound effects, it looked like Klaus was a horny 14-year-old. At the time, I had access to the main computer room at the 5th-biggest airport in the world; I could have used Knoppix, but I couldn't run it because it truly looked and sounded like it was written by a virgin.

User interface issues persist. I know Blender is not Linux, but its user-interface and development assumptions point right back to something about the Open Source model, mindset, and methodology.

It's 2019. Computing is infrastructure, and has been for at least 30 years. It's time we had computers and software with reliability and consistency befitting electrical, plumbing, and voice telephony. 99.999% uptimes. Consistent, standardized user interfaces. The HOT water is always on the left, the COLD water is always on the right. Mass adoption will follow. So many of the issues with Open Source software are not proprietary features or file formats but self-inflicted, stupid issues like K3B's "Probably a buffer underrun..." error message, caused by no volunteer being motivated to fix what is easy to fix. (And don't send me hate telling me to start programming; I'm good at component-level hardware.)

We need to get this right, or it could well be time to enforce a National Computing Code, just like Building Code, Electrical Code, Plumbing Code, etc...

You're all wrong, and here's why:

By MindPrison • Score: 5, Interesting • Thread

Blender has been competent for the movie industry for YEARS now.

20 years ago, I got an education as a classic 2D animator, and I jumped on the 3D Studio (yes, before Max) bandwagon because it was the commercially used software back then. When 3D Studio Max came out, it had some distinct advantages over the fresh new hobbyist software available at the time: it was EASY to use immediately. Blender had a horribly hard learning curve in comparison to 3D Studio Max. In Max, you could literally build a city on the fly, and you could learn the basic interface in 5-10 minutes (being any good in any 3D software takes YEARS to master, but that's another story - and it still applies today).

But saying that Blender wasn't ready, or wasn't efficient enough, is just based on lack of knowledge. When I used 3D Studio Max (and even paid for Animation Studio, which was purchased separately from the Max package back then), it was also full of bugs (the infamous freeze bug was horrible to work with, and workarounds were many), not to mention the crashes - Max had plenty of those - and files often became corrupted, so you had to keep an extensive backlog of files and export the assets to NEW files so as not to bring along the bugs, which most artists had no idea what caused.

The biggest disadvantage for me as a Max customer was that I was a single-person company. When I visited the forums back then, the elite Max users already knew so many modelling techniques that they could work around most bugs and avoid Max crashing. Even if you documented a bug thoroughly, the company behind Max would arrogantly blame the user every time. As a user, I changed operating system 3 times until I finally caved and purchased Microsoft Windows 2000 Professional (which at the time was the recommended operating system for using Max properly). I even changed my PC 3 times - there was a math error in one of the Intel processors, so I had to use another one. "It was not the right RAM for your system" was another excuse. None of these improvements to my system fixed the constant animation freezes or the Application Error crashes.

However - a paid, expensive upgrade to a newer version fixed most of the bugs, and introduced new ones. Interestingly, Kinetix never admitted to the bugs, but fixed them in newer versions, to the endless frustration of those of us who just wanted to use the software and make our work - WORK.

I finally gave up on 3D altogether, a frustrated 3D artist who simply hated working with 3D after that. Six months later (well - you can't keep a true 3D artist away from 3D in the long run), I tried Blender.

Of course, Blender in its infancy came with bugs too, and I wrote to the author and main coder of the software (Ton Roosendaal), and he immediately fixed the bugs - it took him like 3 hours to respond and patch the software, and it worked again as it should (at least for me). And that woke me up to a different world I'd not known before: coders - and companies - admitting their flaws and fixing them, taking pride in their software. Without a huge company to worry about lawsuits and whatnot, they could do that - and I was stumped and amazed at the same time.

They even had coders' meetings called the Blender Sunday meet (something like that) on IRC, where you and the coders could meet and discuss future features and real-life bugs. Often these would be fixed the very next day after being reported. The coders were enthusiastic users of the software too and showed no sign of arrogance, just pride in what they do.

Instead of a "you're using it wrong" attitude, they had a different one, one that I've come to know from Mr. Roosendaal himself: if you make it crash, then we must have messed up somewhere. He and the coders worked with the users to locate the causes every time, tirelessly.

And those saying that Blender wasn't meant for a commercial environment clearly don't know the history of Blender. Blender was made as an IN-HOUSE 3D tool for t


By Qbertino • Score: 3 • Thread

Blender has been gradually improving ever since it went FOSS ages ago, and changes and improvements to the UI have happened constantly. This is a larger step, with the Eevee renderer added as the default and the UI improved once again, but the improvements have been coming in continuously. Blender's workflow has been one of the fastest for eons, and its workspace management and GL-rendered UI are second to none. Everyone who knows Blender and other toolkits knows this. Blender needn't measure up to whatever someone used to toolkit X expects. 3D kits are hard to learn - that's a cold hard fact - and being good in one doesn't mean you know the rest. Maya, 3ds Max and Houdini are deeply entrenched in the industry, and many functions of those are way shittier to use than Blender's equivalents, and have been for quite some time now. The thing still holding Blender back (if you want to call it that) is industry pipeline integration and native RenderMan support - and that has mattered less and less in recent years, as Blender is already in use in the industry now.

Bottom line: yes, 2.80 is a huge deal, but so was 2.79 and the ones before it. Ragging on about Blender's UI fails to see the innovations Blender has brought to the field.

My 2 cents.

Re:The biggest problem I've had with Blender

By Tough Love • Score: 5, Interesting • Thread

There are deep mathematical reasons why subdivision surfaces don't converge to perfect spheres, for all the popular subdivision schemes. However, there is a relatively simple fix: generate a perfect sphere by other means - for example, by inflating a cube with subdivided faces - then subdivide the resulting mesh. This will give you a minuscule error factor vs. a true sphere.
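The "inflate a subdivided cube" step can be sketched in a few lines. This is a minimal illustration, not Blender's actual implementation; the grid resolution and the duplicated edge vertices are simplifications:

```python
import numpy as np

def cube_to_sphere(n=8):
    """Build an exact sphere point set by subdividing each face of a
    unit cube into an n x n grid and "inflating" (normalizing) every
    vertex onto the unit sphere. Duplicate edge vertices are harmless
    for this demonstration."""
    lin = np.linspace(-1.0, 1.0, n)
    u, v = np.meshgrid(lin, lin)
    faces = []
    for axis in range(3):            # x, y, z
        for sign in (-1.0, 1.0):     # two opposite faces per axis
            face = np.zeros((n, n, 3))
            face[..., axis] = sign
            face[..., (axis + 1) % 3] = u
            face[..., (axis + 2) % 3] = v
            faces.append(face.reshape(-1, 3))
    pts = np.vstack(faces)
    # inflate: project every vertex radially onto the unit sphere
    return pts / np.linalg.norm(pts, axis=1, keepdims=True)

pts = cube_to_sphere(16)
radii = np.linalg.norm(pts, axis=1)
print(radii.min(), radii.max())   # both 1.0 up to float rounding
```

Every inflated vertex lies exactly on the sphere; only a subsequent subdivision pass over the resulting mesh reintroduces the tiny error the comment mentions.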

Re:worth a try

By MindPrison • Score: 5, Interesting • Thread

IMHO it still lacks a lot in the "I can go and produce something" department, the biggest thing being a materials library. Every professional tool comes with a library of a couple hundred materials now. I see there are a couple add-ons for Blender, but no library comes with the default install, and the way to get materials in is less than intuitive.

Well, there are perfectly good reasons for this. The Blender software, albeit used commercially, is not sold commercially. Texture packs are often licensed from stock photography services and serious photographers. It would be fatal for a non-profit organization like Blender to be sued - perhaps later, once a certain texture has become popular - for millions of dollars in unclaimed profits from using "insert-someone-really-greedy-here's" photo or texture as part of a production somewhere. The Blender organization would actually be responsible for that, and it's not worth the hassle or expense.

Furthermore, adding textures and working with them has never been easier. Blender has for many years now featured drag-and-drop support, which means that as long as you've prepared your object for texturing, you can literally drag and drop a texture from your own library directly onto the selected polygon or mesh in Blender. The same goes for movies and many other things.

And we've added a very interesting new shader for your easy texture needs - the Principled Shader! Select it next time you want to mess around with textures; you'll be amazed to see that it combines things you previously had to mix by hand in any 3D software - glossiness, roughness, skin, SSS and much more - in an all-in-one shader. There's no longer any need to mess around with the Node editor to get the material mix you want - you can still do that for crazy advanced texture combos, but we've made it so much easier for the artists now.

Researchers Find More Than 40 Vulnerable Windows Device Drivers

Posted by EditorDavid
Artem S. Tashkinov writes: Researchers from security company Eclypsium have discovered that more than forty drivers from at least twenty different vendors -- including every major BIOS vendor, as well as hardware vendors like ASUS, Toshiba, NVIDIA, and Huawei -- include critical vulnerabilities allowing an escalation of privileges to full system level access.

Considering how widespread these drivers are, and the fact that they are digitally signed by Microsoft, they allow an attacker to penetrate target systems and networks more successfully, as well as remain hidden. Also, while some of these drivers "are designed to update firmware, the driver is providing not only the necessary privileges, but also the mechanism to make changes," which means an attacker can gain a permanent foothold. Eclypsium has already notified Microsoft about the issues, and at least NVIDIA has released fixed drivers.

Who would have thought...

By blahplusplus • Score: 4, Interesting • Thread

... making drivers and device apps require a login in order to function, or send spying data back to companies, might cause security risks. We really need some laws against privacy-invading software in the OS and drivers.

This software-as-a-service bullshit has gotten way out of hand. This idea that we don't own what we buy - because of BS IP laws bribed into being by large software corporations before the internet was a thing - needs to go.

Re:More than 40?

By Cipheron • Score: 4, Insightful • Thread

This one group *found* new vulnerabilities in 40+ drivers. That's very different to saying that only those 40 drivers have any vulnerabilities.

Personally, I have discovered zero vulnerabilities in any drivers, but I wouldn't conclude from that that all drivers are vulnerability free, because I'm just one guy and I'm not exactly looking for them. From just the summary we don't know how big the group is, how long they looked, or how many total drivers they looked at.

Bad Reporting

By LordWabbit2 • Score: 3 • Thread
A lot of the reporting I am seeing in mainstream media is blaming Windows 10; glad to see Slashdot got it right and is blaming the third-party vendors who wrote shit drivers.


By toonces33 • Score: 3 • Thread

More updates. Something else to look forward to.

windows' weak spot

By sad_ • Score: 4, Insightful • Thread

Is this a surprise to anybody?
Windows drivers have always been the weak spot of the Windows system, causing many a BSOD.
If the drivers are already so bad stability-wise, why would you think their security design would be any better?

Facial Recognition Deployed on Children at Hundreds of US Summer Camps

Posted by EditorDavid
The Washington Post describes a parent whose phone "rings 10 times a day with notifications from the summer camp's facial-recognition service, which alerts him whenever one of his girls is photographed enjoying their newfound independence."

Cory Doctorow reports: You can also call your kid if you think they look unhappy or if you are unsatisfied with them in any way and nag them. So kids mob photographers with big, fake smiles and beg to be photographed so their parents won't harass them.

The companies have "privacy policies" that grossly overreach, giving them perpetual licenses to distribute all the photos they take forever, for any purpose. They claim to have super-secure data-centers, but won't describe what makes them so sure their data centers are more secure than, say, the NSA's, Equifax, or any of the other "super secure" data centers that have been breached and dumped in recent memory.

And while parents in theory enjoy all this looking in on their kids while they're away, they also report a kind of free-floating anxiety: they know just enough about their kids' lives at camp to worry, but not enough to assuage their worries.

One overseer of two camps tells the Post that more concerned parents call her in two hours than used to call in an entire month. One company adds that their service is now being used by over 160,000 parents -- and for children as young as six.

At least one camp takes over 1,000 photos each day -- scanning each one with facial recognition technology -- and the Post reports that facial-recognition technology has now already been deployed at "hundreds" of summer camps all across the United States.


By gtall • Score: 5, Insightful • Thread

Congratulations! Now yer kid will know (a) you are a sneak, (b) she cannot trust you, and (c) if she wanted to get away with something, she just needs to avoid the cameras...her friends will help her with that.

Re:Does it get much more creepy than this?

By sjames • Score: 4, Informative • Thread

What crime? According to treaties signed by the U.S., they have a right to present themselves at the border and request asylum. Since they are acting within their rights, no crime is being committed by them.

They are not in jail awaiting trial for crimes, they are (in theory) in a waiting area awaiting a hearing on their immigration status. Given that, no claims about citizen prisoners also being separated from their children actually apply.

Re:Does it get much more creepy than this?

By swillden • Score: 5, Informative • Thread

Everyone that breaks the law gets separated from their children, citizens included.

Except the vast majority of the people being separated from their children at the border haven't committed a crime. They're applying for asylum which is perfectly legal under US law.

Criminals go to jail, innocents do not.

That's false. The very best you can say is that more criminals go to jail than innocents. I'm not saying that we shouldn't prosecute crimes because some innocents will be erroneously convicted. Fighting crime is important. But don't ever forget that not everyone who is convicted is guilty -- and, obviously, not everyone with a clean record is innocent.

Re:Does it get much more creepy than this?

By drinkypoo • Score: 5, Insightful • Thread

The situation is terrible in multiple ways and I do hate that it is happening, but the real moral problem are folks like you. You do not support the US going and ending the conflicts these people are running from

In many cases the US is already involved in these conflicts, and not in a positive way. But surely more US involvement will help!


By arglebargle_xiv • Score: 4, Interesting • Thread

So it's an endless stream of photos of prepubescent children being pushed out to anyone who claims to be a parent.

I can see a quick way to get this crap shut down in an instant...

Middle-Aged Hearing Loss Doubles Risk of Dementia

Posted by EditorDavid
"Hearing loss in middle age is associated with higher odds of cognitive decline and dementia in later years," reports Reuters, citing a large study in Taiwan. Researchers tracked more than 16,000 men and women and found that a new diagnosis of hearing loss between ages 45 and 65 more than doubled the odds of a dementia diagnosis in the next dozen years. Even mild levels of hearing loss could be a risk factor, so hearing protection, screening and hearing aids may be important means of reducing cognitive risk as well, the study team writes in JAMA Network Open.

"Hearing loss is a potential reversible risk factor for dementia, including Alzheimer's disease," said senior study author Charles Tzu-Chi Lee of National Taiwan Normal University in Taipei.

Past research suggests that about two thirds of the risk for dementia is hereditary or genetic, which means about one third of the risk is from things that are modifiable, Lee noted. Among modifiable risk factors, hearing loss accounts for about 9% of dementia risk, a greater proportion than factors like hypertension, obesity, depression, diabetes and smoking. "The early identification of hearing loss ... and successful hearing rehabilitation can mitigate the negative effects of hearing loss," Lee told Reuters Health by email.

Re:Causal or coincidental?

By gweihir • Score: 5, Insightful • Thread

It can also be causal the other way round (people who later got dementia were not careful with their hearing), or both can be the causal result of a third factor.

Statistics are tricky in a lot of ways. The worst part is the interpretation at the end, and deriving any sort of advice that is not bogus or even harmful.

Most important sentence in the article

By 93 Escort Wagon • Score: 3 • Thread

"The study was not designed to determine how hearing loss might contribute to dementia, or if the two conditions share the same cause."

In other words, the conclusions being highlighted here - that preventing hearing loss may prevent dementia - may very well have no basis in fact.


By thegarbz • Score: 4, Informative • Thread

Is not causation. Correlation is not causation.

You were so quick to try and get first post that you didn't even realise the only one who has so far proposed this causal relationship was you.


By deviated_prevert • Score: 5, Interesting • Thread

Is not causation. Correlation is not causation.

So people with hearing loss go on to develop dementia. That does not mean that the hearing loss is causing dementia!

No, hearing loss does not by itself cause dementia, but hearing loss can change the severity of the onset of advanced dementia. I cook in a seniors' facility and have for many years dealt with people with differing levels of dementia. Dementia can be both a physical and a psychological affliction, and hearing is one of the key factors. Certainly those who also have hearing loss progress more rapidly down the scale of mental acuity and interactive response. The human mind will hear what it wants to hear; that is natural. Paranoia is much more common and severe in those with hearing loss and early-onset dementia.

One really good example is a wonderful woman currently in the facility where I work. She is not stable enough mentally to do things on her own and can wander off if not watched all the time. Her hearing is still very good; she sings constantly, smiles, and is a joy to have around. On the other side of the scale, there are those who cannot understand what you say at all because of hearing loss. Their countenance at the same level of dementia is much different, and they tend to react much differently when redirected to safety after wandering. They also do not live as long on average, and they progress down the scale of physical dementia much more quickly than people who still have adequate hearing.

So from evidence and observation, it is obvious that hearing loss affects dementia, and that losing the use of the centers of the brain that process sound affects the speed at which clinical dementia progresses.

I'm hearing impaired ...

By CaptainDork • Score: 4, Interesting • Thread

... and even with hearing aids my comprehension rate is 68%. That means I can hear conversation but I misunderstand the words.

I'm 73 now, but going back to when I was middle age, before hearing aids could help, I often came across as demented.

I joked that I could read lips if people wrote big enough. Anyway, I often sat with my eyes glazed over as people talked around me because I could not hear. The optics were not in my favour.

When a couple or group had their backs turned away from me, I would perceive that there was a break in the convo and I could get a word in. I got chastised for rudely interrupting.

My boss asked me when I was going to get hearing aids and I jokingly said, "When you have $1,700 worth of shit to say."

I did spend the money in 1995 when I was (let's see, carry the 1 ...) 50, because I was in a conference with out of town execs and all eyes turned to me.

I'm like blank.

My manager told the CEO-looking suit that, "He didn't hear the question."

Not a great career move, eh?

As an IT guy I could perform miracles of a semi-religious nature and as an electronics technician, I was so good that I could synchronize random noise and stuff.

Anyway, it wasn't dementia.

Another Google Service Closes: Texts with Voicemail Transcripts

Posted by EditorDavid
Long-time Slashdot reader freelunch reports that Google Voice "has announced via email that they are ending one of their most popular features -- sending transcripts of voice mails via text message. The cited reason is carrier message blocking."

From Google's email: It has come to our attention that certain carriers are blocking the delivery of these messages because they are automated and, at times, contain transcripts that resulted from unsolicited robocalls.

We can no longer ensure these messages will be delivered, so unfortunately we are turning down the feature. We have been slowly rolling out these changes and expect them to be fully deployed by 9 August 2019. No action is needed on your part.

However, the "Get voicemail via email" feature continues to be supported. As an alternative, the Google Voice iOS, Android and web apps can always be used to check voicemail and view transcripts.

Re:On the bright side...

By kiphat • Score: 5, Informative • Thread
They're only getting rid of VM transcriptions via SMS. They will continue to provide VM transcriptions via E-Mail.

Re:On the bright side...

By stephanruby • Score: 5, Informative • Thread

They are just ending the service that does it for you.

Actually, don't let the click-bait fool you: it will still work for you, just not via SMS.

The transcription service will still work, but it will be sent by email instead of SMS. And I assume it will still work over the web interface and over the Google Voice app as well.

Certain carriers

By bl968 • Score: 3 • Thread

are most likely offering a competing service and are doing their best to obstruct Google's service - T-Mobile's Visual Voicemail service, for example.

DARPA Hopes To Develop an AI Tool That Can Detect Deepfakes

Posted by EditorDavid
America's Defense Department "is looking to build tools that can quickly detect deepfakes and other manipulated media amid the growing threat of 'large-scale, automated disinformation attacks,'" reports Nextgov: The Defense Advanced Research Projects Agency on Tuesday announced it would host a proposers day for an upcoming initiative focused on curbing the spread of malicious deepfakes, shockingly realistic but forged images, audio and videos generated by artificial intelligence. Under the Semantic Forensics program, or SemaFor, researchers aim to help computers use common sense and logical reasoning to detect manipulated media.

As global adversaries enhance their technological capabilities, deepfakes and other advanced disinformation tactics are becoming a top concern for the national security community... Industry has started developing tech that uses statistical methods to determine whether a video or image has been manipulated, but existing tools "are quickly becoming insufficient" as manipulation techniques continue to advance, according to DARPA. "Detection techniques that rely on statistical fingerprints can often be fooled with limited additional resources," officials said in a post on FedBizOpps...
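To make "statistical fingerprints" concrete, here is a toy sketch of one classic check - my own minimal illustration, not DARPA's or any vendor's detector: it flags image blocks whose noise level is inconsistent with the rest of the frame, a common tell of spliced content.

```python
import numpy as np

def noise_map(img, block=32):
    """Per-block noise estimate from a crude high-pass residual
    (a 4-neighbour Laplacian), returned as a grid of block stds."""
    hp = (4 * img[1:-1, 1:-1] - img[:-2, 1:-1] - img[2:, 1:-1]
          - img[1:-1, :-2] - img[1:-1, 2:])
    h, w = hp.shape
    h, w = h - h % block, w - w % block
    blocks = hp[:h, :w].reshape(h // block, block, w // block, block)
    return blocks.std(axis=(1, 3))

def flag_blocks(img, block=32, k=3.0):
    """Flag blocks whose noise level deviates from the frame's median
    by more than k robust deviations (median absolute deviation)."""
    m = noise_map(img, block)
    med = np.median(m)
    mad = np.median(np.abs(m - med)) + 1e-9
    return np.abs(m - med) > k * mad

# synthetic demo: paste a low-noise patch into a noisy "photo"
rng = np.random.default_rng(0)
img = rng.normal(0, 10, (256, 256))
img[64:128, 64:128] = rng.normal(0, 1, (64, 64))   # "spliced" region
print(flag_blocks(img).sum() > 0)   # True: the pasted patch stands out
```

DARPA's point is precisely that checks like this are easy to defeat - a forger who matches the noise statistics slips through - which is why SemaFor aims at semantic inconsistencies instead.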

Beyond simply detecting errors, officials also want the tools to attribute the media to different groups and determine whether the content was manipulated for nefarious purposes. Using that information, the tech would flag posts for human review. "A comprehensive suite of semantic inconsistency detectors would dramatically increase the burden on media falsifiers, requiring the creators of falsified media to get every semantic detail correct, while defenders only need to find one, or a very few, inconsistencies," DARPA officials said.

But that's easier said than done. Today, even the most advanced machine intelligence platforms have a tough time understanding the world beyond their training data.

Remember Autorun.inf Malware In Windows? Turns Out KDE Offers Something Similar

Posted by EditorDavid
Long-time Slashdot reader Artem S. Tashkinov writes: A security researcher has published proof-of-concept (PoC) code for a vulnerability in the KDE software framework. A fix is not available at the time of writing. The bug was discovered by Dominik "zer0pwn" Penner and impacts the KDE Frameworks package 5.60.0 and below. The KDE Frameworks software library is at the base of the KDE desktop environment v4 and v5 (Plasma), currently included with a large number of Linux distributions.

The vulnerability stems from the way the KDesktopFile class (part of KDE Frameworks) handles .desktop and .directory files. It was discovered that malicious .desktop and .directory files could be crafted to run arbitrary code on a user's computer. When a user opens the KDE file viewer on the directory where these files are stored, the malicious code contained in the .desktop or .directory files executes without user interaction such as running the file.

Zero user interaction is required to trigger code execution — all you have to do is browse a directory containing a malicious file using any of KDE's file-browsing applications, like Dolphin.
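For the curious, the issue involved KConfig's shell-expansion feature. A malicious entry would look roughly like the hypothetical fragment below; the exact syntax is an assumption based on public write-ups of the PoC, shown only to illustrate the mechanism:

```ini
[Desktop Entry]
Type=Directory
# The [$e] marker asks KConfig to shell-expand the value, so merely
# rendering this entry's icon runs the command with the user's
# privileges -- no click required.
Icon[$e]=$(touch ~/.pwned)
```

The fix was to stop performing shell expansion on entries read from untrusted files.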

When ZDNet contacted KDE for a comment Tuesday, their spokesperson provided this response.

"We would appreciate it if people would contact us before releasing an exploit into the public, rather than the other way around, so that we can decide on a timeline together."

mostly doesn't matter

By iggymanz • Score: 3 • Thread

KDE is dying; distros are tossing it out. Only one developer was making daily commits last time I looked.

It's not pining for the fjords - it's shuffled off the mortal coil.

Absolutely. Take down Wikipedia with one packet

By raymorris • Score: 5, Interesting • Thread

You're absolutely positively right. A few years ago I discovered a way to take down Wikipedia and other major web sites by sending a single packet. You wouldn't think you could take down a site hosted on multiple clusters, but there it was.

The name of this vulnerability? It didn't need a name. It didn't need a website. It just needed to be fixed. You won't find press articles about it, because it was handled properly.

I contacted the project responsible for the vulnerable software. Their security team looked into it, and within a day or two a fix was ready. The fix was deployed on major international targets such as Wikipedia. Later that day, the team coordinated with distro maintainers such as Florian at Red Hat to get the fix prepped in the distro update channels. After the update had been widely available for about a day, it was time to explain what was being fixed.

It garnered no attention for me, and as far as we know there were no victims, because we handled it right.

I did get one benefit. Coincidentally, I had a job interview scheduled. The interviewer asked me if I had any experience with Ubuntu systems. I asked them if they get the Ubuntu security emails and asked them to look at the Ubuntu email that had been sent out a few hours before. The email from Ubuntu security began "Ray Morris discovered ...". The interviewer didn't feel the need to ask anymore about my Ubuntu experience. :)

No news here

By Nabeel_co • Score: 5, Interesting • Thread


Listen, I don't think that Open Source is bad; in fact I think it's really good.

But the community is not healthy. Many open source projects lack polish and direction, making them unusable for anyone other than the people most familiar with the software.

Look at Audacity, VLC or almost any Linux distro. There is so much tribal knowledge, and so many layers of crap you have to go through, because everyone wants to do it their way; there is no consensus, and no focus on polish or simplicity.

Audacity breaks every normal paradigm for an audio NLE, VLC is half stuck between being a terminal utility and a GUI media player, and don't even get me started on Linux, or the beloved Ubuntu, riddled with stupid bugs - like not being able to change your hostname in the installer without causing problems once it's installed, or having to dig through 3-4 layers of configuration files to change one network interface.

WHY? There is no need for any of that, but people just keep piling crap on top of crap, or breaking convention because they are too lazy or think it's too inconvenient for them to follow convention.

It's problematic and is the key reason why the masses don't use open source software regularly, or donate to open source projects.

Now, I know, I'm on Slashdot, and we love our open source software on here. But we're not the norm.

Normal people can't be bothered to dig through three layers of config files to change a hostname or set an IP address; normal people will tap space and expect their NLE to toggle between play and pause; normal people would expect VLC's conversion functions that are in the GUI to actually work, and not to have to resort to the terminal to use VLC - when really all it does is use ffmpeg anyway.

It's crazy and is holding our community back.

What does this have to do with KDE? Well, if everyone stuck to well-established conventions, then we wouldn't have weird flaws like this, because we'd all be using the same well-tested conventions that have everyone's eyes on them, not just a niche group that prefers one thing over the other.

It's like, open source is supposed to be about how people can work together to release and maintain something for free, because it's maintained by the same people who use it, and anyone can contribute.

But what it really shows is how fragmented everything is, and how no one works together: everything fragments and forks until you have 20 ways to do the same thing, all of which suck, none of which have overlapping feature sets - a situation where you're always splitting the difference and never truly happy.

It's a shame, and we can and should be doing better.

Re:Not a bad guess. Doesn't happen to be right

By raymorris • Score: 5, Insightful • Thread

> I'm not convinced the current consensus on the latter benefits anyone but lazy vendors and those who want to capitalize on vulnerabilities

99% of those of us with any experience in the field are convinced.

One thing to keep in mind is that roughly 99.98% of attackers are script kiddies - they click to run a set of 10,000 prewritten exploits; they don't figure out any themselves. The vast majority of bad guys exploit things AFTER public release, and often don't even know a new exploit has been added to the toolkit.

The immediate disclosure position depends on the argument that if there are a dozen people in the world who could theoretically find it, we should distribute an exploit to tens of thousands of bad guys.

Thanks, KDE. And Gnome

By OneHundredAndTen • Score: 4, Insightful • Thread
The problem with KDE and Gnome is that they are trying - and failing miserably - to elevate Linux on the desktop by imitating Windows as closely as they can, in this case by recreating its security weaknesses. With such desktop flagships, no wonder Linux has been spinning its wheels on the desktop, and chances are it will carry on doing so. Their saving grace: by attempting to out-Windows Windows, thus guaranteeing that Linux will not become the choice of the masses on the desktop, KDE and Gnome allow those of us who use other options on the Linux desktop to have a convenient and secure desktop system while being, for the most part, disregarded by malware creators. Keep it up, KDE and Gnome! Linux on the desktop needs you, although not in the way you would like.

San Diego's Connected Streetlights Taught to Recognize Bicycles

Posted by EditorDavidView on SlashDotShareable Link
Last year the city of San Diego installed 3,200 smart streetlights, each one monitoring 36 x 54 meters of pavement. They originally used the data to time traffic signals -- but now Slashdot reader Tekla Perry summarizes a report from IEEE Spectrum: Developers for the City of San Diego spent months training its smart streetlights to recognize and count bicycles from just about any angle. The system is now monitoring bicycle traffic, but a few issues remain -- such as figuring out how to distinguish between bicycles being ridden and bicycles merely being carried, say on a bike rack or thrown in a pickup truck.

The software has a similar problem with pedestrian-counting: When a convertible comes into view, it is counted as both a car and a pedestrian--the visible driver.

Do you mean "Traffic Lights"?

By Mr D from 63 • Score: 3 • Thread
Street lights are those things that light the streets.

Bad idea

By gurps_npc • Score: 5, Insightful • Thread

The claimed idea was to help planners decide where bike paths are needed. But they are going to put them in the exact wrong place - where people are already able to bike, rather than where people would LIKE to bike but can't.

During World War II, the British carefully tracked where returning aircraft had been shot. Then they added armor to the places with the LOWEST percentage of hits.

Why? Because only the surviving planes returned. The areas never hit were areas that were hard to shoot. The areas with many hits were non-essential and could survive the damage. The areas with a low number of hits were areas that could be hit -- and if a plane got hit there, it would not make it back.

Similarly, the areas with high bike traffic may not need bike lanes. The areas with low bike traffic might be exactly where they are needed.

Not to mention the massive privacy invasion for minimal return.


By Misagon • Score: 3 • Thread

I think maybe the problem is that they are not properly pre-processing the data before feeding it into a neural network.

A car is big. A bike is not. Both are able to run at speed.
Pedestrians are small and slow.
You don't need a neural network to detect large and small shapes and to attach velocities to them. Do that properly as pre-processing and you won't need a large neural network, and you won't miscategorise too much. Also, you should make the system future-proof for different types of bikes, motorcycles, Segways, electric scooters, monowheels and whatever else people may come up with.

Existing automatic systems that time traffic signals detect cars as big objects and do not detect bicycles; instead, bicyclists can press a button in front of the traffic light to get priority. Sometimes the simple solution is the best.

BTW ... Even better would be if there were dedicated bike lanes. Biking in the midst of many cars is not fun.
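The size-and-speed heuristic the comment describes could be sketched as a tiny rule-based pre-classifier. This is illustrative only -- the thresholds and the `classify` name are made up, not values from San Diego's system:

```python
def classify(area_m2: float, speed_kmh: float) -> str:
    """Rule-based pre-classification of a tracked object by
    bounding-box footprint and estimated speed."""
    if area_m2 > 4.0:        # car-sized footprint
        return "car"
    if speed_kmh > 8.0:      # small but fast: bike, scooter, etc.
        return "bike"
    return "pedestrian"      # small and slow
```

A stage like this in front of (or instead of) a neural network would also sidestep the convertible problem in the summary: a car-sized object gets classified once, regardless of whether a person is visible inside it.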

virtuous use of totalitarian technology

By astrofurter • Score: 3 • Thread

Yay panopticon!

Yay ubiquitous mass surveillance!

Yay getting snooped by streetlights!

NO WAY this system will be turned to evil purposes - UNPOSSIBLE!

Ride your bike around the gulag!

Everything's gonna be okay, 'cuz Big Brother loves us all.

Ask Slashdot: How Will Abandonware Work With Today's DRM Locked Games?

Posted by EditorDavidView on SlashDotShareable Link
dryriver writes: Thousands of charmingly old-fashioned computer and console games from the 8-bit, 16-bit, MS-DOS era are easily re-playable today in a web browser -- many Abandonware websites now feature play-in-browser emulated games. Here is a video of 101 charming old MS-DOS games, most of which can be re-played on Abandonware websites across the internet in seconds.

But what about today's cloud-linked, DRM crippled games, which won't even work without Steam/Origin/UPlay, and many of which don't even allow you to host your own multiplayer servers anymore? How will we play them 20 years from now -- on what may be Android, Linux or other OSs -- when they are tethered into the cloud? And is writing a fully-working emulator for today's complex Windows/DirectX games even feasible?

How will Abandonware work 20 years from now?


By AntiSol • Score: 5, Insightful • Thread

It won't. 50 years from now, historians will be lamenting the massive black hole of lost data and useless encrypted binary blobs that is the early 21st century's legacy. It'll be kind of like those lost Doctor Who episodes, only on a massive scale -- imagine if only 5% of the Doctor Who tapes had survived and the rest were gone forever. More like that. Interestingly, it'll be all the little indie titles that didn't have DRM that will still exist.

Maybe there will be careful reverse engineering projects for some of the most beloved and iconic games. I seem to recall that there's some exemption to DRM anti-circumvention laws for archival purposes, so perhaps some of those projects won't even be sued into oblivion and you'll be able to fire up a modified version of Borderlands or GTA5 that's actually playable. But there sure won't be a library of virtually every game from this era playable in your browser.

I'm glad you've had this thought occur to you. After all, we've only been trying to tell you this for the best part of 20 years now ("we" being "those crazy conspiracy theorist paranoid freak anti-drm nerds who really just want to pirate stuff and are totally evil and definitely don't have any valid points"). Welcome!


By nospam007 • Score: 3 • Thread

Not at all!
That was kind of the point.

Same as all cloud stuff...

By DidgetMaster • Score: 4, Informative • Thread

...is gone when the cloud provider quits providing the service. If they go bankrupt. If the development team loses interest. If they just get greedy and cut you off if you decide not to pay their new extortion rates. Your data becomes inaccessible. Your service goes away. Period.

Re:same as always

By lgw • Score: 5, Interesting • Thread

> Many old games, with some effort, are still playable with a few hacks - not cracks.

Bullshit. DRM, which used to be called "copy protection," has been around as long as commercial games. GOG cracks the game as part of preparing it for sale on their site. On a good day, the source code is still around and they can just remove the DRM and re-build.

I've heard that for Diablo 1, they went to even more heroic lengths than usual, as the source code had been lost and some changes were needed just to run on modern systems. They recreated source from object code and went from there. I hope they at least had debug symbols from the retail build.

I've been involved with projects like that, back in the day: you have the legal rights, you have the object code, but you don't have the source code. It's a lot of "fun".

> That is a good reason to buy games on GOG. It is, in general, a way to help make sure that if a game becomes unplayable, it will mostly be due to technical issues pertaining to the game itself, not a stuck digital rights management system. You can hope to be able to play The Witcher 3 in 20 years.

Bad example, as GOG and CD Projekt Red are part of the same conglomerate and don't like DRM. The awesome thing about GOG is that games with nasty DRM on other platforms are clean on GOG.

Why I got out of the industry

By dcollins • Score: 3 • Thread

I worked in computer games for some years in the late '90s. This was a large part of my thinking when I quit the industry permanently (not the only reason, but when I think back on it nowadays, it's the first thing that comes to mind).

Note that the positive examples in the OP are all MS-DOS games. My industry experience was in the transition to Windows 95 native. I worked on a triple-A racing title that used fairly specialized low-level code; to my knowledge that game was unplayable after about 10 years. The second company I worked at was an all-online digital collectible game company (MTG wannabe), and this was very much on my mind. Spending my life's work on something so incredibly ephemeral that it almost immediately disappeared like a ghost of history troubled me greatly.

Arguably, games that could be archived and emulated/played forever were a teeny-tiny, abnormal slice of computer game history, right at the start. Maybe it's old-man disease to expect that ownership and archiving should be any different.

Antitrust Issues? Amazon Pressured Sellers Offering Cheaper Prices on

Posted by EditorDavidView on SlashDotShareable Link
"Amazon's determination to offer shoppers the best deals is prompting merchants selling products on its marketplace to raise their prices on competing websites," reports Bloomberg: Amazon constantly scans rivals' prices to see if they're lower. When it discovers a product is cheaper on a rival site, Amazon alerts the company selling the item and then makes the product harder to find and buy on its own marketplace -- effectively penalizing the merchant. In many cases, the merchant opts to raise the price on the rival site rather than risk losing sales on Amazon.

Pricing alerts reviewed by Bloomberg show Amazon doesn't explicitly tell sellers to raise prices on other sites, and the goal may be to push them to lower their prices on Amazon. But in interviews, merchants say they're so hemmed in by rising costs levied by Amazon, and so reliant on sales on its marketplace, that they're more likely to raise their prices elsewhere.
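The pricing mechanism Bloomberg describes amounts to a simple rule. Here is an illustrative sketch of that rule, not Amazon's actual system; the `flag_uncompetitive` name is hypothetical:

```python
def flag_uncompetitive(our_price: float, rival_prices: list[float]) -> bool:
    """Return True if any rival site lists the item cheaper,
    i.e. the listing would lose its prominent placement."""
    return any(p < our_price for p in rival_prices)
```

A merchant flagged this way faces exactly the choice the article describes: lower the Amazon price, or raise the rival one.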

Antitrust experts say the Amazon policy is likely to attract scrutiny from Congress and the Federal Trade Commission, which recently took over jurisdiction of the Seattle-based company.

An analyst specializing in antitrust litigation tells Bloomberg that the policy "could end up being considered illegal conduct because people who prefer to shop on Walmart end up having to pay a higher price."

Retail too

By spinitch • Score: 3, Interesting • Thread
Retailers command a premium for displaying products at the front of the store, and FMCG companies analyze shelf location. It seems logical that Amazon would offer premium buy-now placement to items with the lowest price, the lowest likelihood of being returned, and the highest consumer satisfaction. It's a nice tip that an item without the buy-now box might be better purchased elsewhere, such as at Walmart.

Re:Walmart ain't exactly up for sainthood.

By deviated_prevert • Score: 5, Insightful • Thread

Right, it's like when a shark eats all the little fish and you feel sorry for the little fish. But then a bigger shark comes along and eats the smaller shark, and now all of a sudden you have to feel sorry for the little shark. It's the circle of life. Except not really: it's actually cancer-capitalism, and it's extremely damaging to society. There's a point where capitalism starts feeding on the very consumers who are supposed to regulate the market.

I am not saying that Amazon is any more sainted than Walmart, especially when the Chinese are dragging kids out of school to fill orders for Alexa-based devices. The point is that the move to unrestrained predatory capitalism is destroying any possible sense of value in the marketplace. The rabid consumerism played upon by unrestrained predatory capitalism is wreaking havoc with much more than the blue-collar work force. It is using up planetary resources at an ever-increasing rate with no thought whatsoever for the long-term future of our children. There is no real intelligence in the way we engage in commerce as a species. Treating humans as nothing more than consumers who must consume at an ever-increasing rate to sustain continuous "economic growth" is fundamentally flawed and eventually doomed to failure, for obvious reasons. Anyone with half a brain can see this; certainly the predatory capitalists do. They rule because we are fundamentally screwed in the brain by the neurotic consumerism of today.

Re: Bullying

By orlanz • Score: 4, Interesting • Thread

Why not consider this from a customer view point?

Amazon is helping me avoid a higher price. If Amazon allowed it, consider the cost in goodwill I would have toward them, and the cost of the returns process or of the call-in support for a price match, when I find the cheaper price elsewhere.

By doing this, Amazon is helping me save time and money at their own cost. It seems the merchant is trying to fleece me based on what door I come to them from.

You have to be a monopoly to abuse a monopoly.

By sjbe • Score: 4, Interesting • Thread

> You don't need to be a monopoly to abuse monopolistic powers

That statement is a non sequitur. You cannot abuse a monopoly without actually being a monopoly, or at least an oligopoly.

Amazon definitely is the 800lb gorilla of online retail. Nobody else is even close but even there they aren't a monopoly - they have something like 40-50% market share at best. Impressive to be sure but hard to argue that is a monopoly in any sense. They are not even close to that status in the overall retail market - they are a single digit percent of retail. Walmart is 3-4X the size of Amazon by retail revenue and there are multiple other retailers that are of comparable size to Amazon in that space.

Re:Come on, what a joke

By Rick Zeman • Score: 5, Insightful • Thread

Yep, and Walmart continually forced vendors to lower their prices, provided templates for outsourcing to China to lower manufacturing costs, and exacerbated the race to the bottom.
Amazon is just a variation on their theme. Fundamentally, they both suck, and both are mostly unavoidable in the US in 2019.

MacGyvering Mars: How NASA's Curiosity Team Worked Around A Broken Drill

Posted by EditorDavidView on SlashDotShareable Link
As of Tuesday the Curiosity rover has been on Mars for over seven years, and this week NASA shared an interactive 360-degree panorama of the planet's Teal Ridge.

Digital Trends provides this update: Curiosity is halfway along its path through a region called the "clay-bearing unit" because the area has a high level of clay minerals. Clay minerals are of particular interest to scientists because they form in the presence of water, suggesting that there used to be water in this location thousands of years ago... The engineers estimate that the rover still has several years of power left in its nuclear power system, and will be able to continue operating beyond that with careful power budgeting.
"This nuclear power source, by the way, means that Curiosity is better equipped to handle monster Mars dust storms, such as the one that killed NASA's solar-powered Opportunity rover last year," reports, sharing more highlights from the years since Curiosity's touchdown: [T]he rover quickly determined that the 96-mile-wide (154 kilometers) crater had hosted a lake-and-stream system in the ancient past. And further observations suggested that this environment was habitable for long stretches, perhaps hundreds of millions of years at a time. Curiosity has also detected several surges of methane in Gale Crater's air...

Curiosity may well live to welcome two more rovers to the Red Planet: NASA's Mars 2020 rover, whose design is based heavily on that of Curiosity, and the European-Russian ExoMars rover are both scheduled to touch down in February 2021.

Tablizer (Slashdot reader #95,088) shares a recent triumph that one NASA official says "represents months and months of work by our team." When an electric motor stalled inside Curiosity's drill, it left the rover unable to reliably extend and retract its drill bit. With the drill feed mechanism no longer reliably working, managers decided to keep the drill bit in its extended position. That raised concerns over the stability of the drill while in use because the prong-like extensions on each side of the bit will no longer be in contact with the rock. "We had to do a big pivot in the mission thinking about how we could drill without the feed motor," said Ashwin Vasavada, the Curiosity mission's project scientist at JPL, in a presentation to the Mars Exploration Program Analysis Group in April.

Controllers devised a way to use force applied by the robotic arm to null out forces generated by the drill, a role the arm was never designed to fill. Engineers used a replica of the Curiosity rover at JPL's "Mars Yard" to test out the new drilling techniques, and the rover drilled a test hole in a rock on Mars in February. That test did not produce a scientifically useful rock sample -- it used only the drill's rotary mechanism, not its hammer-like percussion capability -- but yielded important data for engineers to continue refining the updated drilling technique.
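The force-nulling idea can be pictured as a simple feedback loop: the arm advances just enough to hold the measured thrust on the bit at a target value, taking over the job of the broken feed motor. This is an illustrative proportional-control sketch with assumed names, units, and gain -- not JPL flight software:

```python
def arm_feed_step(target_weight_on_bit: float,
                  measured_force: float,
                  gain: float = 0.1) -> float:
    """Return an arm advance increment (mm) that nudges the measured
    drill thrust toward the desired weight-on-bit (N)."""
    error = target_weight_on_bit - measured_force
    return gain * error  # advance if thrust is low, back off if high
```

Run each control cycle, a loop like this keeps the bit loaded against the rock even though the prongs beside it no longer touch the surface.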

And thanks to this ongoing improvisation, the Curiosity mission's project scientist says, "We now have a key sample we might have never gotten."