Alterslash

the unofficial Slashdot digest

Google Search Finally Adds Information About Video Games

Posted by timothy in Search • View
An anonymous reader writes Google has expanded its search engine with the capability to recognize video games. If your query references a game, a new Knowledge Graph panel on the right-hand side of Google's search results page will offer more information, including the series it belongs to, initial release date, supported platforms, developers, publishers, designers, and even review scores. Google spokesperson: "With today's update, you can ask questions about video games, and (while there will be ones we don't cover) you'll get answers for console and PC games as well as the most popular mobile apps."

I don't like

By NotInHere • Score: 3 • Thread

the current trend of Google creating a "smart search" that directly answers your questions. Not because it isn't useful, but because projects like Wikipedia suffer from it. This is even a direct competitor to Wikidata. I still don't understand why Wikidata isn't copyleft; it's a bad decision in my eyes. Or is there no copyright on databases? Then I'll look forward to open Google-scraping projects.

Peter Kuran:Visual Effects Artist and Atomic Bomb Archivist

Posted by timothy in Technology • View
Lasrick links to this interview with Peter Kuran, an animator of the original Star Wars and legendary visual effects artist, writing If you saw the recent remake of Godzilla, you saw stock footage from Atom Central, known on YouTube as 'the atomic bomb channel.' Atom Central is the brainchild of Kuran, who among his many talents is an expert on archival films of the atmospheric testing era of 1945 to 1963. Combining his film restoration and photography expertise with his interest in nuclear history, he has also produced and directed five documentaries. He is currently working with Lawrence Livermore and Los Alamos National Laboratories to preserve and catalog images from the bomb-testing era, and to produce a technical handbook that will help people understand these images and the techniques used to create them.

OwnCloud Dev Requests Removal From Ubuntu Repos Over Security Holes

Posted by timothy in Linux • View
operator_error notes a report that ownCloud developer Lukas Reschke has emailed the Ubuntu Devel mailing list to request that ownCloud (server) be removed from the Ubuntu repositories because it contains "multiple critical security bugs for which no fixes have been backported," through which an attacker could "gain complete control [of] the web server process." From the article: However, packages can't be removed from the Ubuntu repositories for an Ubuntu version that was already released, which is why the package was removed from Ubuntu 14.10 (2 days before its release) but is still available in the Ubuntu 14.04 and 12.04 repositories (ownCloud 6.0.1 for Ubuntu 14.04 and ownCloud 5.0.4 for Ubuntu 12.04, while the latest ownCloud version is 7.0.2). Furthermore, the ownCloud package is in the universe repository, and software in this repository "WILL NOT receive any review or updates from the Ubuntu security team" (you should see this if you take a look at your /etc/apt/sources.list file), so it's up to someone from the Ubuntu community to step up and fix it. "If nobody does that, then it unfortunately stays the way it is", says Marc Deslauriers, Security Tech Lead at Canonical. You can follow the discussion on the Ubuntu Devel mailing list. So, until (if) someone fixes this, if you're using ownCloud from the Ubuntu repositories, you should either remove it or upgrade to the latest ownCloud from its official repository, hosted by the openSUSE Build Service.

Why not allow the update into the repos?

By saloomy • Score: 3 • Thread
That seems like a lot of dick-measuring on the part of developers. Why wouldn't Canonical simply update the repository with patches that address known security vulnerabilities? Where are the years of support? When you update your package list, the developers of those packages should be able to post updates...

This is why Linux is not desktop ready... too many stubborn minds pushing their way.

Re: Packages can't be removed?

By Anonymous Coward • Score: 5, Insightful • Thread

The developer has fixed the code. They're not responsible for maintaining the repositories of every single distribution out there; that's the job of the distribution's package maintainer. Problem is, the package maintainer hasn't done their job, so the developer has raised concerns and asked for the package to be pulled until they do. It's just irresponsible for the package maintainers to come back and say "we can't pull it, we're leaving it as is, and we're not patching it either".

Re: Packages can't be removed?

By Gaygirlie • Score: 4, Informative • Thread

No, they're responsible for maintaining their packages in every repository they wish to add their package to, though. If they want to be part of the Ubuntu repo, rather than hosting their own repository, they play by Ubuntu's rules. Don't like it? Run your own repository.

They do: http://software.opensuse.org/d...

They're not the ones maintaining the packages in Ubuntu's repos; that's Ubuntu folks' own doing.

Re: Why not allow the update into the repos?

By fnj • Score: 5, Informative • Thread

I don't think our AC(s) have the slightest idea how real life works. Developers don't "want their packages included" in any specific distro. Developers develop. They put the stuff out there and continually modernize it. Distros pick and choose what versions of what packages they include in any given release at the time of release. That's when all major revs are frozen for the duration of use of that distro release. The whole rat's nest of apps and libraries has to work together; you can't just update one piece of it.

The alternative is a rolling release like Arch, where every package is continually updated to the latest. The downside to that is when, for example, Apache 2.2 gets updated to 2.4, your website stops working because they changed the details of the config file. Rolling is the way to go for desktop, where you don't want million-year-old obsolete packages preventing you from getting anything done, but not so much for servers.

This is to help the clueless understand. Obviously you know how it works.

Clarification regarding backports

By Lukas Reschke • Score: 5, Informative • Thread

Lukas from ownCloud here (the one mentioned in that article). I have to say that this escalated in a way I certainly did not intend. However, I'd like to clarify one thing.

The article states "for which no fixes have been backported". With that I meant to refer to the Ubuntu packages, not to ownCloud 5 or 6 themselves. We still support ownCloud 5 with security patches and critical bugfixes, and ownCloud 6 with bugfixes and security patches. This might have been unclear.

I sent this request to Ubuntu because we're very much concerned about our users. While some of us might know that using the "Universe" repository is not that great an idea for internet-facing software, most people don't. Furthermore, I don't believe it's the responsibility of the developer to update packages in every single distribution out there. Especially with distributions such as Ubuntu, you have to follow quite complex processes, such as SRU, which consume a lot of time.
Additionally, some people in the comments seem to claim that "one developer of ownCloud is noted as maintainer for the Debian package". This is a legacy entry, and as you can see in the changelog at http://metadata.ftp-master.deb... Thomas last modified the packages on 11 Oct 2012.

We always recommend that our users use one of the supported installation methods, such as owncloud.org/install, where we even provide our own repositories for most distributions.

(Disclaimer: Opinions expressed in this post are solely my own and do not necessarily also express the views of the ownCloud project or my employer)

Microsoft Now Makes Money From Surface Line, Q1 Sales Reach Almost $1 Billion

Posted by timothy in Hardware • View
SmartAboutThings writes Microsoft has recently published its Q1 fiscal 2015 earnings report, disclosing that it has made $4.5 billion in net income on $23.20 billion in revenue. According to the report, revenue has increased by $4.67 billion, compared to $18.53 billion from the same period last year. However, net income has decreased 14 percent compared to last year's $5.24 billion mainly because of the $1.14 billion cost associated with the integration and restructuring expenses related to the Nokia acquisition.

But what's finally good news for the company is that the Surface gross margin was positive this quarter, which means the company is finally starting to make money on Surface sales. Microsoft didn't reveal Surface unit sales, but we know that Surface revenue was $908 million this quarter, up a massive 127 percent from $400 million this time last year. However, if we assume that the average amount spent on this year's Surface Pro 3 was around $1,000, then we have less than 1 million units sold, which isn't that impressive, but it's a good start.

Gross margin?

By whoever57 • Score: 3 • Thread

But what's finally good news for the company is that the Surface gross margin was positive this quarter, which means the company finally starts making money on Surface sales.

I think that someone doesn't understand accounting very well. There are all kinds of real costs that don't get factored into the gross, so this report does not show whether or not Microsoft is actually making money on Surface sales. For example, all that advertising cost.

Re:Those bastards?

By ChunderDownunder • Score: 5, Funny • Thread

Ease up, Android will be one of the few Linux distros by 2016 not using systemd. :-)

Re: Did they make money on Surface?

By PopeRatzo • Score: 5, Interesting • Thread

I own a Surface Pro 2 and a Surface Pro 3, and use them for portable music production, live performance and field recording. They are by far the best system for such use. It's a tablet with a touch screen (or stylus), except it can run a full version of ProTools with all the plug-ins and VSTis you could possibly want. Full USB connectivity for audio interfaces, MIDI controllers and peripherals.

If they made a Macbook with a removable touchscreen, it would be close, but Apple seems more intent on having every pixel in the world. I remember when Apple really catered to musicians (except for their slow adoption of audio driver standards). Now, they cater to people watching cat videos. At the moment, there is no device close to the Surface Pro for this purpose. I don't believe this niche is enough to sustain the Surface Pro by itself, but I'm glad to have them right now. And I hope someone else out there is paying attention, which is why I post a comment just like this every time the Surface comes up on Slashdot.

Not that there's anything wrong with cat videos.

Re: Did they make money on Surface?

By PopeRatzo • Score: 4, Funny • Thread

"It's all about ethics in Slashdot comments!"

The corporate sector is where it will sell

By cyber-vandal • Score: 4, Insightful • Thread

A tablet running full Windows, where you can connect seamlessly to Exchange and AD, run Office and other Windows-only apps, and where existing .NET devs can easily write apps for it. The org I work for is trialing them now and the initial feedback has been very positive.

I can see the previous company I worked for going for it in a big way too. They have a lot of field staff who have lots of data to capture.

Days After Shooting, Canada Proposes New Restrictions On and Offline

Posted by timothy in YRO • View
New submitter o_ferguson writes As Slashdot reported earlier this week, a lone shooter attacked the war memorial and parliament buildings in Ottawa, Canada on Wednesday. As many comments predicted, the national government has seized this as an opportunity to roll out considerable new regressive legislation, including measures designed to increase data access for domestic intelligence services, institute a new form of extra-judicial detention, and, perhaps most troubling, criminalize some forms of religious and political speech online. As an example of the type of speech that could, in future, be grounds for prosecution, the article mentions that the killer's website featured "a black ISIS flag and rejoiced that 'disbelievers' will be consigned to the fires of Hell for eternity." A government MP offers the scant assurance that this legislation is not "trauma tainted," as it was drafted well prior to this week's instigating incidents. Needless to say, some internet observers remain, as always, highly skeptical of the manner in which events are being portrayed. (Please note that some articles may be partially paywalled unless opened in a private/incognito browser window.)

Re:Won't past constitutional challenge

By towermac • Score: 4, Funny • Thread

Wow. It's like a different language, but in English.

Fourth possibility...

By denzacar • Score: 5, Insightful • Thread

They've ALWAYS had a draft like that. And any excuse will do to try to push it through.
If there is no excuse, try to push it through anyway.

It's not a conspiracy. It's not a coincidence. They are not waiting for or furnishing events.
They see such events as INEVITABLE. It is a part of their view of reality. It is their life philosophy.
Their BELIEF SYSTEM.
They think they're the good guys.

And once you look around, you'll notice that in other groups of people as well.
Gun nuts really do believe that government is after their guns.
Rich people really do believe that poor people are all lazy.
Hippies really do believe that all people are good, just misunderstood.
Justin Bieber fans really believe that he can sing.

Re:Won't past constitutional challenge

By davester666 • Score: 4, Interesting • Thread

"A government MP offers the scant assurance that this legislation is not "trauma tainted," as it was drafted well prior to this week's instigating incidents"

Of course it was drafted some time ago. Harper was just waiting for something like this to get a way to quickly get it passed into legislation without all that pesky complaining that he got last time he tried doing it.

Unfortunately, the opposition and the press are busy deifying the couple of soldiers [well, two soldiers and a glorified security guard at a cemetery] and Harper for being so courageous, for standing up to this terrorist, and not giving in to fear, while fighting for Canadian freedoms.

Of course, Harper is wallowing in fear, greatly increasing security around himself, and leaping at the chance to be able to spy on more and more citizens, I mean, terrorists. Never mind also giving up Canadian freedoms so that Harper can really give it to his wife tonight.

Our supreme court MIGHT overturn this legislation, but who's going to fund the couple million dollars in legal fees challenging it?

Re:Shot in the back

By EvolutionInAction • Score: 5, Insightful • Thread

You are a fucking idiot.
The weapon was old and outdated, meant to look fancy for tourist pictures. He was unarmed, because here in Canada we don't carry guns without cause.

Notice how this idiot shooter was using a shotgun? That's a shit weapon for a shooting spree like this. THAT is the consequence of our gun control. Hunting weapons are fine and widespread. Human killing weapons are restricted.

Disproportionate response

By Livius • Score: 3 • Thread

In Canada no one is using the word 'terrorism' (except the usual suspects, who would have pulled out the terrorism card no matter what). We honour a soldier who died in the line of duty, but this is a drug use issue, not a security issue.

Politicians who try to exploit fear will likely reveal themselves, and themselves alone, to be weak-minded cowards.

AT&T Locks Apple SIM Cards On New iPads

Posted by timothy in Apple • View
As reported by MacRumors, the unlocked, carrier-switchable SIM cards built into the newest iPads aren't necessarily so -- at least if you buy them from an AT&T store. Though the card comes from Apple with the ability to support all major carriers (and be switched among them in software, if a change is necessary), "AT&T is not supporting this interchangeability and is locking the SIM included with cellular models of the iPad Air 2 and Retina iPad mini 3 after it is used with an AT&T plan. ... AT&T appears to be the only participating carrier that is locking the Apple SIM to its network. T-Mobile's John Legere has indicated that T-Mobile's process does not lock a customer in to T-Mobile, which appears to be confirmed by Apple's support document, and Sprint's process also seems to leave the Apple SIM unlocked and able to be used with other carrier plans. Verizon, the fourth major carrier in the United States, did not opt to allow the Apple SIM to work with its network." The iPad itself can still be activated and used on other networks, but only after the installation of a new SIM.

Re: Non-story?

By R.Mo_Robert • Score: 5, Informative • Thread

AT&T will unlock if you call and ask. They want the opportunity to try and keep your business before unlocking. Last I checked, that's good business: trying to keep your customer. That being said, if you don't like it, go with one of the carriers with significantly less LTE coverage.

This isn't about unlocking the device. All iPads are and always have been unlocked. This is about AT&T's decision to disable using the multi-carrier Apple SIM card (new with this iteration of iPads) on any carrier besides AT&T once you use it with AT&T. (Does Apple even sell the Apple SIM card separately? Maybe in store, but it's certainly not on their website as of now. Your best bet would be just to get an AT&T SIM card if you want to use them and save the Apple card for cooperating carriers.)

Re: Legality

By khellendros1984 • Score: 5, Insightful • Thread
The whole point of the story is that the Apple SIM gets locked by AT&T to their network. The SIM is part of the hardware that you purchased with the iPad. Therefore, the hardware that belongs to you (the SIM) gets locked. Implying that it doesn't matter because the rest of the device remains free to use elsewhere is missing the point.

If AT&T wants to lock a SIM to their service, then they should provide the customer a SIM, rather than disabling functionality in the SIM that the customer already has. Putting it in the contract gives them a right to do it, but it doesn't make it a less-scummy business practice.

Re: Non-story?

By BronsCon • Score: 5, Insightful • Thread
This isn't about equipment purchased from AT&T, on or off contract, with an AT&T SIM, SIM-locked to AT&T's network. This is about equipment purchased from Apple, off contract, with an Apple SIM, not SIM-locked to any network. AT&T is locking that Apple SIM (not the device) to AT&T's network, forcing you to buy another Apple SIM if you wish to switch carriers, something no other carrier is doing. A SIM card, once locked to a carrier, cannot be unlocked.

Re: Non-story?

By zippthorne • Score: 4, Insightful • Thread

It is good business to try and retain a customer. It is terrible behavior to hold someone's property ransom to force them to listen to your pitch.

Depending on how quickly word gets out, and the reaction, the second may not be a productive way of trying to achieve the first.

Re:Go T-Mo

By Miamicanes • Score: 4, Interesting • Thread

What, exactly, does Verizon do that is so dishonest and earns them so much hate?

They lock down their phones, and in the past they've actively disabled features supported by their phones' hardware to force you to use their premium services (Bluetooth modes, Wifi, and GPS have all been casualties of Verizon's lockdown fetish in the past). Compounding matters, there are lots of semi-rural places where Verizon is the only carrier with viable service (or at least, viable service INDOORS). Verizon was also the only carrier who forced bootloader-locking up until AT&T joined the party last year.

That's why T-Mobile is the carrier everyone desperately wants to love, even in areas where their service is poor. They're the only carrier who DOESN'T lock down their phones & try to restrict what you can do with them.

Passwords: Too Much and Not Enough

Posted by Soulskill in Management • View
An anonymous reader writes: Sophos has a blog post up saying, "attempts to get users to choose passwords that will resist offline guessing, e.g., by composition policies, advice and strength meters, must largely be judged failures." They say a password must withstand 1,000,000 guesses to survive an online attack but 100,000,000,000,000 to have any hope against an offline one. "Not only is the difference between those two numbers mind-bogglingly large, there is no middle ground." "Passwords falling between the two thresholds offer no improvement in real-world security, they're just harder to remember." System administrators "should stop worrying about getting users to create strong passwords and should focus instead on properly securing password databases and detecting leaks when they happen."
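A quick back-of-the-envelope sketch of what those two thresholds mean for random passwords (Python; the charset sizes are our own assumptions, not from the Sophos post):

    import math

    # Guess budgets from the post: what a password must survive.
    ONLINE_GUESSES = 10**6
    OFFLINE_GUESSES = 10**14

    def min_length(charset_size, guesses):
        """Smallest random-password length whose keyspace exceeds the budget."""
        return math.ceil(math.log(guesses, charset_size))

    for name, size in [("digits", 10), ("lowercase", 26), ("printable ASCII", 94)]:
        print(f"{name:16s} online: {min_length(size, ONLINE_GUESSES):2d} chars, "
              f"offline: {min_length(size, OFFLINE_GUESSES):2d} chars")

For lowercase-only passwords this prints 5 and 10: everything from 5 to 9 characters sits in exactly the "no middle ground" zone the post describes.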

Re:Why so high?

By gbjbaanb • Score: 5, Insightful • Thread

It's not about entering passwords on the web login page, but about securing the back-end system so that the password database cannot be stolen.

I am constantly amazed at the reports that hackers have accessed the passwords of every user on some site or other. I used to work at a financial company where the web server didn't have physical connectivity to the DB; every request had to go through a service that was not only secured itself, but also could only run stored procedures which were in turn secured. The net result was that if (or rather when) the web site got hacked, all the hacker could do *at best* was access some public data for a single user, which never included the stored password. (Incidentally, the developers didn't have access to production servers either.)

So sites like eBay et al apparently have direct connectivity to their DBs, so if any hacker exploits some zero day that gives them access to the OS, they can simply "select * from user", download every password, hashed or not, and crack them at their leisure.

Personally, I think passwords should be stored in plain text in the DB as a reminder to all developers that they need to be protected in other ways so the hacker cannot access them under any circumstances.

However, what I do find strange is that web devs do not know this. I wrote the above as an Ars Technica comment and the "security editor" promoted a criticism of it in which the concept of a 3-tier architecture was "too expensive" and "inefficient", suggesting that storing your DB credentials in your web code was OK as long as you "secured" it. If this is the level of comprehension of security in the web dev community, then I'm not only unsurprised at the number of hacks, but will be using a randomly-generated password for every website that asks me for a password.
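A minimal sketch of the middle tier being described above, assuming a hypothetical get_public_profile stored procedure and a DB-API driver with callproc support (pymysql here); the web tier can only name a whitelisted operation, never send SQL:

    import pymysql

    # Operations the web tier may request, each mapped to a stored procedure.
    # The service account has EXECUTE rights on these procs only -- no SELECT
    # on tables -- so "select * from user" is impossible even if the web
    # front-end is fully compromised.
    ALLOWED_PROCS = {
        "public_profile": "get_public_profile",  # hypothetical procedure name
    }

    def run(operation, *args):
        if operation not in ALLOWED_PROCS:
            raise PermissionError(f"operation {operation!r} not whitelisted")
        conn = pymysql.connect(host="db.internal", user="svc_app",
                               password="...", database="app")
        try:
            with conn.cursor() as cur:
                cur.callproc(ALLOWED_PROCS[operation], args)
                return cur.fetchall()
        finally:
            conn.close()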

Re: Passwords should not exist

By sexconker • Score: 4, Interesting • Thread

When you send things down a wire, everything is "something you know".
A smart card or an RSA clock or a code sent via SMS is effectively just another password. And while it may be a strong password that's hard for an attacker to know, changes with time, etc., it's still vulnerable to MITM attacks because you're sending your shit over a single, unsecured channel. It's also a password the user has little to no control over, can lose and not have a backup of, etc., so entire management and recovery schemes are introduced to make them usable. They provide very little in terms of security over a strong password. They only fix two problems: weak passwords and keyloggers. But keyloggers are just a subset of compromised boxes, and if you're using a compromised box then you're susceptible to an active attacker MITMing you using your valid smart card / token / codes / etc.

For two-factor security to actually be "two-factor", you have to validate the 2 things separately and via different means. A bank can do this in person by verifying your account information/name/etc. and your photo ID by actually fucking looking at the ID and you. When you automate everything and shove it down a single pipe (the internet), it's all effectively just a password.
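To make that concrete: a time-based OTP of the kind RSA tokens and authenticator apps produce is derived from a shared secret plus the clock, so whatever crosses the wire can be relayed within its validity window. A small sketch with the pyotp library (the base32 secret is a made-up example, not a real credential):

    import pyotp

    secret = "JBSWY3DPEHPK3PXP"  # example base32 secret, provisioned once
    totp = pyotp.TOTP(secret)

    code = totp.now()  # the 6-digit code the user would type
    print("current code:", code)

    # The verifier runs the same derivation, so a MITM who captures the code
    # inside its ~30-second window can replay it down the same pipe --
    # exactly the point the comment above makes.
    print("verifies:", totp.verify(code))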

Re:Why are we still using passwords?

By gbjbaanb • Score: 4, Insightful • Thread

Because it's wrong.

If you treat each word as a symbol, rather than each letter, then you find the average vocabulary is about 10,000 symbols, and you have just generated a 4-character password (admittedly in base-10000 rather than base-26). You'll find it's still easily crackable, especially if the hacker uses pre-generated rainbow tables.

Or to put it another way: your xkcd password, if the user has a vocabulary of 10k words, being cracked by a CPU that can manage a trillion hashes per second (easy), can be brute forced in less than 3 hours. For reference, 16 random characters would take 2.5 billion years (i.e. 64^16 is about 8*10^28 guesses, which at a trillion guesses per second is 8*10^16 seconds, or about 2.5 billion years). OK, probability says your chances are on average half that: only 1.25 billion years.

The best password is a random one: use a tool to generate and store them and let it type them into the password field. If you must use an xkcd-style password, at least stick a digit between each word.
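The arithmetic above, worked through (assuming, as the comment does, a 10,000-word vocabulary, a 64-symbol charset, 10^12 guesses per second, and an average crack at half the keyspace):

    SECONDS_PER_YEAR = 3.15e7
    RATE = 1e12  # guesses per second

    def avg_crack_seconds(keyspace):
        return keyspace / 2 / RATE  # on average half the keyspace is searched

    xkcd = 10_000 ** 4   # four words drawn from a 10k-word vocabulary
    rand16 = 64 ** 16    # 16 random characters from a 64-symbol set

    print(f"4 words : {avg_crack_seconds(xkcd) / 3600:.1f} hours")
    print(f"16 chars: {avg_crack_seconds(rand16) / SECONDS_PER_YEAR / 1e9:.2f} billion years")

which prints about 1.4 hours for the four-word password and about 1.26 billion years for the 16-character one.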

Re:Why so high?

By Altrag • Score: 5, Insightful • Thread

As long as your DB is connected to any network in any fashion, it's susceptible to cracking. You can't possibly know what new attack vectors may arise. And even if you somehow manage to guarantee 100% security in your software (which is frankly unlikely to the point of impossible -- software is hard!), you still can't guarantee that some human participant won't either go rogue or screw up, allowing access to the DB that doesn't require _any_ technological cracking.

The only way to completely lock down a computer is to keep it shut off in a vault that literally nobody can open. But of course that also makes it no more useful than a rock.

As for web devs' comprehension of security: again, software is hard. When they say it's "secured" what they mean is "I can't think of a way in." But of course that's a totally irrelevant metric, because they aren't the attacker.

And that fact applies to everyone, up to and including the most experienced security researcher in the world, because of the obvious fact that if they could think of a way in, they'd patch it. It's always the ones you don't think of that get you.

No matter how smart you are, there will be a maximum program size you hit before the complexity overwhelms you and you have to internally abstract things, and every abstraction is a potential security hole that you may or may not ever consider. Never mind relying on third-party libraries or, for that matter, the security of the hardware layers.

We've definitely come up with some best practices, and it's no secret that there are loads of people who don't know, don't care, or don't have the time to implement them.. but all the best practices in the world don't solve the problem of program complexity exceeding (full) human comprehension.

Re: Passwords should not exist

By ShanghaiBill • Score: 4, Insightful • Thread

it's still vulnerable to MITM attacks

No. The smartcard is pre-programmed with the public key of the authenticator, and vice versa. Unless someone knows the private key of one of the endpoints, the authentication cannot be faked. A MITM attack will not work.
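A minimal sketch of that challenge-response, using Ed25519 from the cryptography package (names and flow are illustrative; real smartcard protocols add more): the authenticator sends a fresh nonce, the card signs it, and only the signature crosses the wire, so a recording or relaying MITM learns nothing that lets him forge the next round.

    import os
    from cryptography.hazmat.primitives.asymmetric import ed25519

    # The card's key pair; the public half was pre-registered with the
    # authenticator, and the private half never leaves the card.
    card_key = ed25519.Ed25519PrivateKey.generate()
    registered_pub = card_key.public_key()

    # Authenticator issues a fresh random challenge for every attempt.
    nonce = os.urandom(32)

    # Card signs the challenge.
    signature = card_key.sign(nonce)

    # Authenticator verifies against the pre-registered public key;
    # verify() raises InvalidSignature on any forgery.
    registered_pub.verify(signature, nonce)
    print("challenge-response OK")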

Verizon Injects Unique IDs Into HTTP Traffic

Posted by Soulskill in YRO • View
An anonymous reader writes: Verizon Wireless, the nation's largest wireless carrier, is now also a real-time data broker. According to a security researcher at Stanford, Big Red has been adding a unique identifier to web traffic. The purpose of the identifier is advertisement targeting, which is bad enough. But the design of the system also functions as a 'supercookie' for any website that a subscriber visits. "Any website can easily track a user, regardless of cookie blocking and other privacy protections. No relationship with Verizon is required. ...while Verizon offers privacy settings, they don’t prevent sending the X-UIDH header. All they do, seemingly, is prevent Verizon from selling information about a user." Just like they said they would.
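One way for a subscriber to check for the header: fetch a plain-HTTP echo service over the cellular connection and look for X-UIDH in what the server saw. (A sketch; httpbin.org/headers simply reflects request headers back, and the injection can only happen on unencrypted HTTP, so an HTTPS request would show nothing either way.)

    import requests

    # Must be plain HTTP: a proxy can't inject headers into an HTTPS stream.
    resp = requests.get("http://httpbin.org/headers", timeout=10)
    headers = resp.json()["headers"]

    # Compare names case-insensitively; proxies and echo services re-case them.
    uidh = next((v for k, v in headers.items() if k.lower() == "x-uidh"), None)
    print("supercookie present:" if uidh else "no X-UIDH header seen", uidh or "")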

Re:Is there a way to prevent this?

By DamnOregonian • Score: 4, Insightful • Thread
Not just sexual harassment. It's safer for a supermodel to walk down MLK in your favorite large city naked than a homely woman to walk from one end of Fort Hood to the other, wearing ACUs after dark.
When soldiering becomes less of a duty and more of a way to delay starting out your life of dismal poverty, you start making the wrong kind of army.

Don't use HTTP. Use HTTPS.

By jtara • Score: 3 • Thread

Don't want your carrier messing with your traffic?

Use HTTPS.

Hello Vodafone

By wabrandsma • Score: 5, Informative • Thread
From: Using Browser Properties for Fingerprinting Purposes.

Vodafone injects the X-VF-ACR header: 'Vodafone Anonymous Customer Recognition'. It is unclear what this header does exactly; all headers that have been seen start with the string "204004DYNMVFNLACR", followed by 16 X's, and are followed by a BASE64-encoded 256-byte ciphertext, which we were unable to decrypt. It has been suggested that this string might contain the SIM-card identifier (IMSI) or other personal information, as was found in research conducted by Mulliner in 2010 [14]. Vodafone did not respond to requests to explain this header. Nevertheless, the presence of this header certainly identifies customers of Vodafone as being customers of Vodafone.

Re:Telling The Story Backwards and Upside Down.

By DamnOregonian • Score: 4, Informative • Thread
I have a good friend there right now. There have been 2 attempts on her where she had to physically fight someone off of her, and the first 2 days of reception were sexual assault awareness classes where they're instructed to stay out of the dark and not go anywhere on-base that they're not familiar with or get into any cars they're not familiar with. No shit. On a US army base.

Not all web sites offer HTTPS

By tepples • Score: 5, Insightful • Thread
And lose access to several websites. Slashdot, for example, redirects HTTPS hits to HTTP for non-subscribers because ad networks have been slow to implement HTTPS. And a lot of shared web hosts don't support HTTPS because their policies haven't been updated in the six months since the last major Server Name Indication-ignorant desktop web browser (IE on Windows XP) reached end of support in April. But HTTPS support is the second biggest reason I stopped going to TV Tropes in favor of All The Tropes (after licensing).
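For the curious, checking a site for that downgrade behavior is easy: request the HTTPS URL without following redirects and look at the Location header. A sketch (results obviously vary with the site and with whether you're logged in):

    import requests

    def downgrades_to_http(url):
        """True if an HTTPS request is redirected to a plain-HTTP URL."""
        resp = requests.get(url, allow_redirects=False, timeout=10)
        location = resp.headers.get("Location", "")
        return resp.is_redirect and location.startswith("http://")

    print(downgrades_to_http("https://slashdot.org/"))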

Secretive Funding Fuels Ongoing Net Neutrality Astroturfing Controversy

Posted by Soulskill in Technology • View
alphadogg writes: The contentious debate about net neutrality in the U.S. has sparked controversy over a lack of funding transparency for advocacy groups and think tanks, which critics say subverts the political process. News stories from a handful of publications in recent months have accused some think tanks and advocacy groups of "astroturfing" — quietly shilling for large broadband carriers. In a handful of cases, those criticisms appear to have some merit, although the term is so overused by people looking to discredit political opponents that it has nearly lost its original meaning. An IDG News Service investigation found that major groups opposing U.S. Federal Communications Commission reclassification and regulation of broadband as a public utility tend to be less transparent about their funding than the other side. Still, some big-name advocates of strong net neutrality rules also have limited transparency mechanisms in place.

ISPs v. Content Producers

By Etherwalk • Score: 3 • Thread

This is just ISPs v. Content Producers, each fighting over who can bribe Congress more. (Siding with content producers is basically everyone else who cares about the issue and has time or money to spend on it, which is probably less than 0.01% of everybody.)

Great

By Lunix Nutcase • Score: 3 • Thread

Still, some big-name advocates of strong net neutrality rules also have limited transparency mechanisms in place.

And who exactly are they, and where is your proof of their limited transparency mechanisms? Do you have actual specifics, or simply vague FUD?

The saddest part is.....

By Dega704 • Score: 3 • Thread
Even with the misleading propaganda efforts, the public in general overwhelmingly supports Net Neutrality. If this issue were put to an actual vote, I have zero doubt that it would win by a landslide. I have yet to meet a single tech-savvy person that supports paid prioritization, even among conservatives. Sadly, that doesn't seem to matter. If it did, we would be some kind of democracy or something. Heaven forbid.

A Low Cost, Open Source Geiger Counter (Video)

Posted by Roblimo in Build new • View
Sawaiz Syed's LinkedIn page says he's a "Hardware Developer at GSU [Georgia State University], Department of Physics." That's a great workplace for someone who designs low cost radiation detectors that can be air-dropped into an area where there has been a nuclear accident (or a nuclear attack; or a nuclear terrorist act) and read remotely by a flying drone or a robot ground vehicle. This isn't Sawaiz's only project; it's just the one Timothy asked him about most at the recent Maker Faire Atlanta. (Alternate Video Link)

Slashdot jumps the shark...

By creimer • Score: 4, Insightful • Thread
Submitting somebody's LinkedIn profile as a news story must mean either a slow news day or a new low in quality standards.

geiger counters vs. survey meters

By iggymanz • Score: 3 • Thread

Geiger counters are great for prospecting for uranium or looking for any residual contamination after being in a hot site. However, they will be easily overloaded in a nuclear disaster area and could even give a very low rad reading while you are getting a maiming or lethal dose. What you need is called a "survey meter", and they do NOT work on the same principles as a G-M tube. But I daresay this guy will need a different type of electronics to make a survey meter that could be dropped in; your normal SoCs and microprocessors will go apeshit in a rad environment.

Better solutions

By Okian Warrior • Score: 5, Informative • Thread

I've been building geiger counters as a hobby for the past couple of years. I was consulting with some people in Japan right after Fukushima, helping to build reliable detectors.

Geiger Muller tubes require a specific "plateau" of voltage to get consistent results. Too low and you're not picking up much radiation, too high and you get spurious results and can burn out the tube. The correct voltage varies with individual tubes.

This isn't normally a problem, except that there's a glut of surplus Russian geiger tubes on the market right now with unknown provenance and unknown parameters. Unless you calibrate each tube to find the plateau voltage, and unless you calibrate the resulting counter with a known source, the data you get will have no predictive value.
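To illustrate that calibration step, a small sketch: sweep the tube voltage against a fixed check source, compute the slope between neighboring readings, and operate in the flattest segment. (The numbers below are invented for illustration; a real sweep comes from your own tube and source.)

    # (voltage, counts_per_second) pairs from a sweep against a check source.
    sweep = [(300, 2), (350, 40), (400, 118), (450, 122), (500, 125),
             (550, 129), (600, 171), (650, 260)]

    # Slope between neighboring points; the plateau is where the rate
    # barely rises with voltage.
    slopes = [(v1, (c2 - c1) / (v2 - v1))
              for (v1, c1), (v2, c2) in zip(sweep, sweep[1:])]

    v_flat, _ = min(slopes, key=lambda s: abs(s[1]))
    print(f"operate near {v_flat + 25} V (middle of the flattest segment)")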

It's straightforward for a hobbyist to put together a project using one of these tubes and get it to click in the presence of radiation, and this makes a fine project for electronics learning, but you have to take further steps to get a reliable instrument. No one ever does this. The circuits I've seen have an unregulated high-voltage supply proportional to the battery voltage - it gets lower over time as the battery runs down. The voltage is chosen from the tube spec sheet, instead of determined for the individual tube. Circuits have design flaws such as using zener diodes for regulation but not allowing enough current through the diode for proper function. And so on.

I've seen lots of these hobbyist projects in the past few years, especially since Fukushima. They're fine projects and well-intentioned, but generally not of any practical use.

Does radiation detection (with actual accuracy, linearity, and repeatability, not just a quick demonstration that you can add some noise to a webcam by pointing a small sealed source at it) have currently good, or at least promising for the not too distant future, solid state options?

Virtually any semiconductor will detect radiation. What you want is a semiconductor with a large capture aperture(*), which is the area through which the radiation passes. A 2N2222 transistor will detect radiation quite well, but its capture area is tiny and won't see much of the radiation (saw the top off a metal-can version and use a charge amplifier).

Power transistors such as the 2n3055 have large silicon dies and therefore larger apertures - as much as a square centimeter - but this is also quite small for capture.

The modern equivalent is to use a big diode such as a PIN diode. These can be quite large, but also expensive for the hobbyist.

A GM tube has a capture area which is the cross sectional area of the tube. These can be made quite large; and as a result can be made quite sensitive to the amount of radiation flux in the area. Hobbyists can also make their own tubes with enormous capture areas - it's not very difficult.

Large diodes are available for detecting radiation, but a GM tube is simple and can be easily made with a very large capture aperture. Also, a GM tube's capture efficiency (the percentage of the radiation that gets in which is actually detected) can be higher than the diode solution's.

(*) There's capture aperture and detection efficiency. GM tubes have an efficiency of about 10%, meaning that only 10% of the radiation that gets into the tube is detected. Diodes have similar efficiencies, depending on the photon energy and thickness of the silicon die.

Re:Better solutions

By Okian Warrior • Score: 4, Informative • Thread

Are there any issues with silicon solar cells that make them (protected against visible light, obviously) unsuitable? Compared to power silicon or anything for computation you can get enormous area for relatively little money.

Huh. I hadn't thought of that. A quick google search shows that solar cells can be used as radiation detectors, and they generally have large capture areas. I'll have to try this out.

This looks like a good background document for detecting radiation using semiconductors.

This is the type of amplifier you need as a 1st stage in your detector, should you want to build your own. (Google "Charge Amplifier" for more info.)

The radiation comes in as quick pulses (3 us or so in my circuits), so normal incident light shouldn't interfere with the detection. You could perhaps get both power and detection from the same cell.

I've been interested in detecting not only the radiation, but the direction it came from. A 3-d array of detectors with an incidence/correlation circuit can give a general idea of the direction of the source, relative to the detector. I haven't done this yet due to the complexity and expense of the detectors, but solar cells being cheap and easily available I might just try this out. Hmmm...
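A rough software sketch of that coincidence idea (detector names and timestamps invented for illustration): hits that land within a short window across different detectors count as one event, and which detector fired first hints at the source direction.

    # Pulse timestamps in microseconds from three detectors.
    hits = {"A": [10.0, 250.3, 611.0], "B": [10.4, 480.2, 611.2], "C": [611.5, 900.0]}
    WINDOW = 1.0  # coincidence window, microseconds

    def flush(cluster):
        if len({d for _, d in cluster}) > 1:  # hits on more than one detector
            print("coincident hit:", cluster, "-> earliest:", cluster[0][1])

    events = sorted((t, d) for d, ts in hits.items() for t in ts)
    cluster = [events[0]]
    for t, d in events[1:]:
        if t - cluster[0][0] <= WINDOW:
            cluster.append((t, d))
        else:
            flush(cluster)
            cluster = [(t, d)]
    flush(cluster)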

Thanks for the suggestion.

Computer Scientist Parachutes From 135,908 Feet, Breaking Record

Posted by Soulskill in News • View
An anonymous reader writes: The NY Times reports that Alan Eustace, a computer scientist and senior VP at Google, has successfully broken the record for highest freefall jump, set by Felix Baumgartner in 2012. "For a little over two hours, the balloon ascended at speeds up to 1,600 feet per minute to an altitude of 135,908 feet, more than 25 miles. Mr. Eustace dangled underneath in a specially designed spacesuit with an elaborate life-support system. He returned to earth just 15 minutes after starting his fall. ... Mr. Eustace cut himself loose from the balloon with the aid of a small explosive device and plummeted toward the earth at speeds that peaked at more than 800 miles per hour, setting off a small sonic boom heard by observers on the ground. ... His technical team had designed a carbon-fiber attachment that kept him from becoming entangled in the main parachute before it opened. About four-and-a-half minutes into his flight, he opened the main parachute and glided to a landing 70 miles from the launch site."

Re:Baumgartner took too much credit

By Ancil • Score: 5, Insightful • Thread

I like this new one, seems to have been done for the right reasons.

What exactly are "the right reasons"?

I'm being serious. Is there some sort of "right" or "noble" reason to spend all this money jumping from slightly higher than the last guy who spent a lot of money?

Am I missing something here? Off the top of my head, the only reasons which come to mind are "extreme wealth" and "boredom".

Re:Being a computer scientist

By 93 Escort Wagon • Score: 5, Funny • Thread

it would have been cool if he would have jumped from 128,000 feet. ;)

That would only work if he was employed by Western Digital or Maxtor.

Re:Not to be outdone

By 93 Escort Wagon • Score: 5, Funny • Thread

Made even more of an awesome feat due to the fact that Mir had been de-orbited in 2001.

OP failed to mention that, to make the jump possible, Mr. Putin first plans to throw Mir back into orbit.

Guinness, not Geek

By Mister Liberty • Score: 4 • Thread

For the bizarre books that the former keep.
Nothing to do with geeks even if (and increasingly 'precisely because') it involves Google.

Re:Record

By Zynder • Score: 4, Funny • Thread
128,908 + 14,337j feet....

Researcher Finds Tor Exit Node Adding Malware To Downloads

Posted by Soulskill in Management • View
Trailrunner7 writes: A security researcher has identified a Tor exit node that was actively patching binaries users download, adding malware to the files dynamically. The discovery, experts say, highlights the danger of trusting files downloaded from unknown sources and the potential for attackers to abuse the trust users have in Tor and similar services. Josh Pitts of Leviathan Security Group ran across the misbehaving Tor exit node while performing some research on download servers that might be patching binaries during download through a man-in-the middle attack.

What Pitts found during his research is that an attacker with a MITM position can actively patch binaries, if not security updates, with his own code. In terms of defending against this sort of attack, Pitts suggested that encrypted download channels are the best option, both for users and site operators. "SSL/TLS is the only way to prevent this from happening. End-users may want to consider installing HTTPS Everywhere or similar plugins for their browser to help ensure their traffic is always encrypted," he said via email.
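Beyond encrypting the channel, the other standard defense is checking what you downloaded against a digest published out-of-band. A minimal sketch (the expected value below is a placeholder, not a real digest):

    import hashlib

    def sha256_of(path, chunk=1 << 20):
        """Hash a downloaded file in chunks so large binaries don't fill memory."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            while block := f.read(chunk):
                h.update(block)
        return h.hexdigest()

    expected = "placeholder-digest-from-the-vendor-page"  # fetch over HTTPS
    if sha256_of("installer.exe") != expected:
        raise SystemExit("digest mismatch: the binary was altered in transit")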

Re:Defaults

By lgw • Score: 5, Informative • Thread

Sorry, "HTTPS everywhere", not "-only" - it tries HTTPS first, which helps with a bunch of sites so you don't have to bookmark the https version specifically, but still falls back to HTTP when needed.

Everyone should use that plugin in normal browsing IMO - it will drive traffic to HTTPS, and really there's no reason for non-HTTPS sites anymore. Slashdot, are you listening, you HTTP-only weenies?

Re:Checksums

By bug1 • Score: 4, Insightful • Thread

What you need is a digital signature instead.

And make sure its signed by a large well known company that works at the government level. Then you are really safe !!!

Bitcoin users also MITM by exit nodes recently

By qubezz • Score: 3 • Thread

There have been several reports of Bitcoin users that use online wallets and exchanges, even over https, getting MITM attacked when using Tor. They visit the wallet site, get bad certificates but continue anyway, and poof, their Bitcoins in the service are gone and their passwords are known by the attacker. With recent SSL vulnerabilities or clever redirection, the cert errors could be avoided also. For other sites, users can be piped through a "universal phisher" to steal any credentials.

Clearly Tor users are under attack by exit nodes, many of them running automated tools against many web destinations.

Re:Bitcoin users also MITM by exit nodes recently

By NotInHere • Score: 5, Insightful • Thread

if you
1) use an online wallet
2) accept bad certs
you certainly live a risky life.

This is not really big news.

By MartinG • Score: 3 • Thread

Tor provides anonymity. It does not provide authenticity or secrecy, and doesn't pretend to. If you want those things, you should use something else in addition to tor. For example, TLS or SSH might suit your needs.

Employers Worried About Critical Thinking Skills

Posted by Soulskill in News • View
Nerval's Lobster writes: Every company needs employees who can analyze information effectively, discarding what's unnecessary and digging down into what's actually useful. But employers are getting a little bit worried that U.S. schools aren't teaching students the necessary critical-thinking skills to actually succeed once they hit the open marketplace. The Wall Street Journal talked with several companies about how they judge critical-thinking skills, a few of which ask candidates to submit to written tests to judge their problem-solving abilities. But that sidesteps the larger question: do schools need to shift their focus onto different teaching methods (i.e., downplaying the need for students to memorize lots of information), or is our educational pipeline just fine, thank you very much?

Maybe too much critical thinking is the problem

By Atrox666 • Score: 3 • Thread

Maybe those with critical thinking skills already figured out that corporate America is a sucker's game. ..sent from my cubicle.

Employees who can "just figure it out"

By QuasiEvil • Score: 3 • Thread

I'm not a manager-manager, but I am a technical manager and - at the end of the day - basically the guy who gets the hiring decision whenever I need more people.

I don't care about what you know beyond the basics, and I also don't care where (or if) you went to college or that your degree is even slightly related to what we're doing. The things I look for are that you have some talent with system design, architecture and programming, a passion for technology (aka, it's not just a 9-5 job thing, but you eat, live, and breathe it), and the capability to go learn and figure things out on your own. Along with the third thing, a general, broad set of knowledge is good, but as long as you can use Google or books or experiments to figure things out, I'm okay. I'd much rather you be able to learn and adapt.

You'd be amazed how many people fail at least #3. I don't want to hand-hold you or have to spoon feed you answers. Don't know? Go look it up. Go try something. Just don't come over and ask for help right away. If you've gotten stuck somewhere, I'll help, but you damn well better have beaten your head against the wall for a few hours/days/weeks (depending on problem complexity) before asking.

Re:Bennett to the rescue!

By Razed By TV • Score: 5, Funny • Thread
This Bennett Hasselton thing has gotten out of hand enough to become a meme. Now I have to read his name when he isn't writing some clickbait article. I'm done with Slashdot.

Re:What is critical thinking?

By akozakie • Score: 5, Interesting • Thread

Ok... Now please explain what that huge difference you perceive is, the one that warrants the use of the words "highly doctored". Because to me this looks like just a longer version of the same thing. "Don't teach them to think, teach them to accept whatever the parents and the church want them to". Quite hard for me to find any redeeming aspect of that line. It's just a combination of catering to not-so-bright parents afraid of losing authority because of their own stupidity, and to everyone in power, political, religious or any other, as dumber people are easier to control.

Re:What is critical thinking?

By akozakie • Score: 5, Interesting • Thread

Nah, you're wrong, thinking black-and-white. That's just a case of optimization error. They most definitely do want people with good critical thinking skills, just not too many - just enough to fill the right positions. They simply missed the golden ratio, too many people are on the "herd" education track. The positions are filled by idiots and companies lose money. They just assumed the "right" group is big enough to support their growth and they were wrong.

Recent Nobel Prize Winner Revolutionizes Microscopy Again

Posted by Soulskill in Science • View
An anonymous reader writes: Eric Betzig recently shared in the Nobel Prize for Chemistry for his work on high-resolution microscopy. Just yesterday, Betzig and a team of researchers published a new microscopy technique (abstract) that "allows them to observe living cellular processes at groundbreaking resolution and speed." According to the article, "Until now, the best microscopes for viewing living systems as they moved were confocal microscopes. They beam light down onto a sample of cells. The light penetrates the whole sample and bounces back. ... The light is toxic, and degrades the living system over time. Betzig's new microscope solves this by generating a sheet of light that comes in from the side of the sample, made up of a series of beams that harm the sample less than one solid cone of light. Scientists can now snap a high-res image of the entire section they're illuminating, without exposing the rest of the sample to any light at all."

We demand more Bennett!

By Anonymous Coward • Score: 5, Funny • Thread

This is boring and their work sounds useless to the world-at-large. We need more world-changing articles by Bennett Hasselton coming up with better algorithms to solve the queueing issues for the ice lines at Burning Man.

*sigh*

By gstoddart • Score: 3 • Thread

Betzig came up with his Nobel-winning microscope (PALM) when he'd grown frustrated with the limitations of other microscope technologies. The so-called lattice light-sheet microscopy that he describes in Thursday's paper was the result of his eventual boredom with PALM.

*sigh* And some of us have yet to get bored with "pull my finger".

Re:Bennett Haselton on the implications

By RingDev • Score: 4, Interesting • Thread

I have a friend who works for a laser microscopy manufacturer. They use this technology (or systems very similar to it) to be able to record, in real time, cellular activity, INSIDE the cell, without killing the cell.

You know how it's 2014 and we still don't understand how memories are formed, or what the exact interactions between cancers and healthy cells are, or how we're always looking for new ways to deliver targeted medication/toxins on a cellular level?

Yeah, all of that ties back to this. Want to know what exactly is going on as the ebola virus invades a cell? This will let you see it, in real time.

This is the bedrock that leaps in scientific knowledge are based on. We are staring at the shoulders of a giant.

-Rick

Re:We demand more Bennett!

By Anonymous Coward • Score: 2, Informative • Thread

Ok, I'll bite. Why are we trolling Bennett Haselton?

He trolled us first with his endlessly long drivel pieces.

Decades-old Scientific Paper May Hold Clues To Dark Matter

Posted by Soulskill in Science • View
sciencehabit writes: Here's one reason libraries hang on to old science journals: A paper from an experiment conducted 32 years ago may shed light on the nature of dark matter, the mysterious stuff whose gravity appears to keep the galaxies from flying apart. The old data put a crimp in the newfangled concept of a 'dark photon' and suggest that a simple bargain-basement experiment could put the idea to the test. The data come from E137, a "beam dump" experiment that ran from 1980 to 1982 at SLAC National Accelerator Laboratory in Menlo Park, California. In the experiment, physicists slammed a beam of high-energy electrons, left over from other experiments, into an aluminum target to see what would come out. Researchers placed a detector 383 meters behind the target, on the other side of a sandstone hill 179 meters thick that blocked any ordinary particles.

Not to worry, Sheldon's on it...

By QuietLagoon • Score: 3 • Thread
Dark matter is primed for a whole new round of discovery, now that Dr. Sheldon Cooper has begun his research in the field.

Re:Confirming the Brady-Curran model

By pla • Score: 4 • Thread
Dark photons, or darkons , emitted by the boundary layer could simultaneously explain the missing mass and energy of the universe. Do I smell a Nobel prize?

Well, perhaps, but the referenced study failed to find any, thus ruling them out as an option.

Granted, science technically treats negative results as equally important to positive ones; society and the Nobel committee, however, have a pesky bias toward positive results.

weakly interacting != the weak nuclear force

By thegreatemu • Score: 5, Insightful • Thread

I got about 1 paragraph into the article before it became obvious that the author had no clue what the hell he was talking about. Maybe the old paper was better, but I don't have the patience to try to find out. From TFA:

They would interact only through the feeble weak nuclear force—one of two forces of nature that ordinarily flex their muscle only within the atomic nucleus—and could disappear only by colliding and annihilating one another

So many things wrong just in that sentence:
1) Weakly Interacting Massive Particles (WIMPs) do have very low interaction cross sections (read: rates). There's sometimes an unfortunate ambiguity in the fact that physicists have no imagination and gave two of the fundamental forces the names Strong and Weak. To say something interacts Weakly means that it interacts by exchange of W or Z bosons, not just that it has a low rate. However, the WIMP interaction cross section has been known to be sub-Weak by several orders of magnitude for decades.

2) The Weak force's most obvious manifestation is in the production or absorption of neutrinos (beta decay or inverse beta decay) in a nucleus, but that's certainly not the only place it shows up; it's the mechanism for neutrino-electron scattering, muon decay, and a whole bunch of other stuff up to driving supernova explosions

3) Self-annihilation is the vanilla model for WIMP transformation, but there are plenty of sundaes-with-cherries-on-top models like self-interacting dark matter, which is discussed about 2 sentences later. Also, the chi is the symbol for the supersymmetric neutralino, often equated to a vanilla WIMP, and is not at all specific to the self-interacting dark matter model.

In short, cbtfaij;dr (can't bother to find an intelligent journalist; don't read)

From the article

By ChrisMaple • Score: 3 • Thread

"All of this has to be done in a very tight straitjacket."

Pretty much sums up the whole subject.

Re:That's all well and good...

By lgw • Score: 5, Informative • Thread

But, it makes the equations balance...

That's how science works. The predictions of the current model fail - the equations don't balance. You'll get many competing hypotheses each with its own suggestion for a new something that makes the equations balance. There were quite a few ideas for "dark matter" including a few "we just got gravity wrong" ideas.

There was no doubt at all that something was missing in established theory about galaxies and gravity - too much data to argue with. It's not like someone just invented dark matter out of the blue, then went looking for a use for it! There was no reason at the time to prefer any particular hypothesis.

Then the CMBR data gave us a fairly accurate measurement of the ratio of dark matter to ordinary matter in the early universe, and removed any doubt that it must be cold dark matter of some sort - not c or near-c particles, not a different theory of gravity; those ideas were falsified by the new data. And in fact only the WIMP theory of dark matter accurately predicted the new measurement.

Dark energy is still early in this curve. There's no doubt about the data: there's something we don't know about the universe at very large scale, and it's the dominant effect at that scale. There are a bunch of hypotheses about what it might be, but that's about it right now.