the unofficial Slashdot digest

African States Aim To Improve Internet Interconnections

Posted by samzenpus in Technology • View
jfruh writes A rapidly growing percentage of Africans have access to the Internet — and yet most of the content they access, even things aimed specifically at an African audience, is hosted on servers elsewhere. The reason is a bewildering array of laws in different nations that make cross-border cooperation a headache, a marked contrast to places like Europe with uniform Internet regulations. At the Africa Peering and Interconnection Forum in Senegal, a wide variety of Internet actors from the continent are aiming to solve the problem.

Fake NVIDIA Graphics Cards Show Up In Germany

Posted by timothy in Technology • View
An anonymous reader writes "Several fake NVIDIA cards — probably GeForce GT 440 — have had their BIOS reflashed to report themselves as GeForce GTX 660. They were sold under the brand "GTX660 4096MB Nvidia Bulk" but only deliver 1/4 of the speed of a real GTX 660. Investigations are ongoing into who did the reflashing, but several hundred of them have already been sold and are now being recalled."
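One way such a reflash can be caught is by comparing the name the VBIOS reports against what the silicon can actually do. A minimal Python sketch, assuming the publicly listed CUDA core counts (GTX 660: 960, GT 440: 96) and a hypothetical `measured_cores` value obtained from a benchmark or device query:

```python
# Sketch: flag a card whose reported name doesn't match its measured core count.
# The core counts below are taken from public spec sheets and are assumptions
# for illustration, not an authoritative database.
EXPECTED_CORES = {
    "GeForce GTX 660": 960,   # Kepler-class part
    "GeForce GT 440": 96,     # Fermi-class part
}

def looks_reflashed(reported_name, measured_cores, tolerance=0.1):
    """Return True if the measured core count falls far below what the
    reported model name implies (a hint of a tampered VBIOS)."""
    expected = EXPECTED_CORES.get(reported_name)
    if expected is None:
        return False  # unknown model, can't judge
    return measured_cores < expected * (1 - tolerance)

# A genuine GTX 660 passes; a GT 440 masquerading as one does not.
print(looks_reflashed("GeForce GTX 660", 960))  # False
print(looks_reflashed("GeForce GTX 660", 96))   # True
```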

Interesting case...

By MeistaDieb • Score: 3, Informative • Thread
The cards were all sold by the distributor Kosatec. Kosatec itself bought the cards directly from Point of View in the Netherlands (proof was given by invoices and transport packaging). Point of View's statement is that they did not produce the cards... Could get real interesting :-D

I bought one of these for Litecoin mining

By flowerp • Score: 5, Interesting • Thread

I made a test order of one of these products to evaluate whether they are any good for mining. The 4 GB of video RAM and the supposed graphics chip would have made it a very good deal.

But it became apparent immediately that this was an outdated Fermi-generation chip, despite the card being recognized as a GTX 660 by the driver. The card ended up on my scrap heap because it was useless for my purpose (high power consumption and low performance).

At the time I assumed it was some kind of OEM product (relabeling older chips under newer product names is very common in the GPU business). But the investigation by c't magazine seems to indicate that there is some VBIOS tampering going on and that this is not happening with nVidia's blessing at all.

I'll be following the story closely to see what the outcome of this clusterfuck will be.

NASA Telescopes Uncover Early Construction of Giant Galaxy

Posted by timothy in Science • View
littlesparkvt (2707383) writes "Astronomers have uncovered for the first time the earliest stages of a massive galaxy forming in the young Universe. The discovery was made possible through combining observations from the NASA/ESA Hubble Space Telescope, NASA's Spitzer Space Telescope, ESA's Herschel Space Observatory, and the W.M. Keck Observatory in Hawaii. The growing galaxy core is blazing with the light of millions of newborn stars that are forming at a ferocious rate. The paper appears in the journal Nature on 27 August." (Here's the NASA press release.)

Re:Construction? So you are telling me there is a

By Tablizer • Score: 4, Funny • Thread

No, that's a "microscope", son.

Fish Raised On Land Give Clues To How Early Animals Left the Seas

Posted by samzenpus in Science • View
sciencehabit writes When raised on land, a primitive, air-breathing fish walks much better than its water-raised comrades, according to a new study. The landlubbers even undergo skeletal changes that improve their locomotion. The work may provide clues to how the first swimmers adapted to terrestrial life. The study suggests that the ability of a developing organism to adjust to new conditions—its so-called developmental plasticity—may have played a role in the transition from sea to land.


By fuzzyfuzzyfungus • Score: 5, Funny • Thread
"Ve...haf vays... of making you valk...'

Alternate link to story...

By SternisheFan • Score: 4, Insightful • Thread has a longer, more descriptive article/video...

Something smells fishy...

By penguinoid • Score: 3 • Thread

These creatures take to land like a fish takes to water.

Netflix Open Sources Internal Threat Monitoring Tools

Posted by timothy in News • View
alphadogg (971356) writes Netflix has released three internal tools it uses to catch hints on the Web that hackers might target its services. "Many security teams need to stay on the lookout for Internet-based discussions, posts and other bits that may be of impact to the organizations they are protecting," wrote Andy Hoernecke and Scott Behrens of Netflix's Cloud Security Team. One of the tools, called Scumblr, can be used to create custom searches of Google sites, Twitter and Facebook for users or keywords.

Re: I wonder why they released these.

By bsDaemon • Score: 5, Interesting • Thread

Their github account has 3 pages worth of stuff and they put a lot back into FreeBSD, too.

Netflix in news

By Comen • Score: 3 • Thread

Why is it that this is the only article I can really find online about Netflix petitioning the FCC to not allow the Comcast/TWC merger?

The three tools (because TFA is, well...)

By xxxJonBoyxxx • Score: 3 • Thread

#1: Scumblr: Ruby-based, web-configured application that allows searching the Internet for sites and content of interest. Includes libraries for sites like Google, Facebook, and Twitter.
#2: Workflowable: Ruby gem that routes different kinds of detections from Scumblr to specific processes.
#3: Sketchy: takes screenshots of web finds for Scumblr.

(I might be a little off, but the Karma gods will surely reward me.)
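The pattern the three tools implement (search for keywords, then route each hit to a workflow) can be sketched in a few lines. This toy Python version is illustrative only; the real tools are Ruby applications and their APIs differ:

```python
# Toy sketch of the Scumblr/Workflowable pattern: documents are matched
# against keywords, and each hit is routed to a named workflow. This is an
# illustration of the pipeline, not the actual Netflix tools.
def route_findings(documents, rules):
    """rules maps keyword -> workflow name; returns (workflow, doc) pairs."""
    findings = []
    for doc in documents:
        for keyword, workflow in rules.items():
            if keyword.lower() in doc.lower():
                findings.append((workflow, doc))
    return findings

docs = ["New exploit targets example.com login", "Cat pictures thread"]
rules = {"exploit": "security-triage"}
print(route_findings(docs, rules))
# [('security-triage', 'New exploit targets example.com login')]
```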

Old Doesn't Have To Mean Ugly: Squeezing Better Graphics From Classic Consoles

Posted by samzenpus in Games • View
MojoKid writes If you're a classic gamer, you've probably had the unhappy experience of firing up a beloved older title you haven't played in a decade or two, squinting at the screen, and thinking: "Wow. I didn't realize it looked this bad." The reasons games can wind up looking dramatically worse than you remember aren't just the influence of rose-colored glasses — everything from subtle differences in third-party hardware to poor ports to bad integrated TV upscalers can ruin the experience. One solution is an expensive upscaling unit called the Framemeister, and while its cost may make you blanch, this sucker delivers. Unfortunately, taking full advantage of a Framemeister may also mean modding your console for RGB output. That's the second part of the upscaler equation. Nearly every old-school console could technically output RGB, which carries the red, green, and blue signals separately, but many of them weren't wired for it externally unless you used a rare SCART cable (SCART was more common in other parts of the world, particularly Europe). Modding kits or pre-modded consoles cost money, but if you're willing to pay, you can experience classic games with much better fidelity.

No device necessary

By wbr1 • Score: 3, Insightful • Thread
Pretty interesting idea and a nice slashvertisement. How about instead using an emulator, pushing a resolution that looks good on your panel, and even applying AA and other filters until it looks how YOU like? You have far more options for less cash that way. This reeks of monster-cable-itis to me.

Re:No device necessary

By drinkypoo • Score: 4, Interesting • Thread

I'm not going to buy an "expensive" upscaler, but I'd rather use the real consoles. I actually run into emulation errors with games I want to play on a semi-regular basis. I don't think that it's unreasonable to think about buying a scaler, even if it's unreasonable to buy this one.

It would be nice if someone would kick out a television with a fancy scaler built in. AQUOS and Bravia televisions (among others... I have an older example of the former, just barely pre-LED-backlight) have scalers which provide pretty good results for video sources at typical resolutions while also adding minimal latency, which is their primary appeal as compared to other lines — especially since the competition caught up in the black level department. But someone like Vizio (which is commonly favored by gamers due to sharp, clean scaling, if a bit jaggy at times) might consider offering some models with a seriously upgraded scaler and offering them to gamers as a means of improving their old-school gaming experience. Even people who don't own classic consoles, or who keep them in a box in their closet, might consider spending some extra money on such a feature even if they wind up never actually using it.

Not me, but some people :) Never know what the future holds for my TV, though.

It's Worth The Effort

By Tempest_2084 • Score: 5, Interesting • Thread
I've done this with all my classic consoles, and the results are worth it. Most consoles can support RGB without any mods, but a few require building an amp or a special board (the NES is the hardest to mod). I'm using RGB for my Genesis, SNES, Saturn, Dreamcast, N64, Neo Geo, NES, PSX, TurboGrafx, and SMS. On systems that could already support S-Video (Saturn, PSX, SNES, N64, DC) RGB isn't a huge step up but it is noticeable, but on systems that were stuck with composite (NES, Genesis, Neo Geo, TurboGrafx, SMS) it's a night and day difference.

I have all my consoles using Euro style SCART cables (these are fairly cheap and easy to find on ebay). The biggest issue is finding a nice CRT that supports RGB as most end user monitors do not. This is where the Sony PVM comes in. It's a high end CRT display that was mostly used by video production and television companies. These monitors support RGB along with S-Video and composite (although why you'd want to use composite after you have RGB is a mystery). They used to be pretty cheap, but now that more people are getting into RGB modding they've shot up in price over the past year or two. 20" models can still be found for $100 or so, but the larger models (27" tubes) can run $300 or more. If you're resourceful enough you can find them locally or on Craigslist as many local companies are finally starting to junk them. I have some friends who use the Frame Meister, but I think the PVM looks better. These systems were meant to be played on CRTs (not to mention you can use light guns).

In the end it's really not that hard to do, but there is an upfront cost involved. Still, if you're into classic gaming on original systems you should really look into it. This site has a lot of good info:

I don't mind old graphics, I mind 10,000 FPS

By GoodNewsJimDotCom • Score: 3 • Thread
My modern card turns on the high-gear fans when I play... Asheron's Call 1. I got it because it is my favorite MMORPG and there are no monthly fees anymore, just a one-time fee of $10. I don't know my driver software well enough, but I'd assume you could frame-cap it. If anyone still remembers when Starcraft 2 came out, lots of people's cards fried because they were rendering way over 60 FPS, and Blizzard needed to patch it.

There's no reason modern cards should go into all-out maximized-FPS mode on old games. I also don't like the extra heat in the summer. I'm thinking of playing some AC1 in a few months when it gets colder. There's no reason AC1 should crank out much heat at all, but I guess I just don't know how to stop my graphics card from going all out on an older game.
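A frame cap can also be applied in software; the idea is simply to sleep out the remainder of each frame's time budget (driver control panels expose the same feature). A minimal, illustrative Python sketch:

```python
import time

def frame_limited(render, fps_cap=60, frames=3):
    """Call render() at most fps_cap times per second by sleeping out
    the remainder of each frame's time budget."""
    budget = 1.0 / fps_cap
    for _ in range(frames):
        start = time.monotonic()
        render()
        elapsed = time.monotonic() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)

# Capping 3 trivial frames at 50 FPS should take at least ~0.06 s.
t0 = time.monotonic()
frame_limited(lambda: None, fps_cap=50, frames=3)
print(time.monotonic() - t0 >= 0.05)  # True
```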

Other parts of the world?

By GrahamCox • Score: 3 • Thread
SCART was more common in other parts of the world

What other parts? Where are you from? If you include a relative reference, at least mention what it's relative TO. You know, the internet is worldwide, FFS.

$33 Firefox Phone Launched In India

Posted by samzenpus in Technology • View
davidshenba writes Intex and Mozilla have launched the Cloud FX, a smartphone powered by Mozilla's Firefox OS. The phone has a 1 GHz processor, a 2-megapixel camera, dual SIM, and a 3.5-inch capacitive touchscreen. Though the phone has limited features, initial reviews say the build quality is good for the price range. With a price tag of $33 (2,000 INR) and support for local languages, the new Firefox phone is hitting an Indian market of nearly 1 billion mobile users.

1 Billion Mobile Users?

By sdguero • Score: 5, Informative • Thread
Umm... I guess this is assuming that 80% of the people in India are smartphone users. The last I heard, smartphone usage in the USA was around 65%.

The average income in India is $1,500 USD/year vs. the USA, where it is $50,000 USD/year (roughly 33 times higher). $33 doesn't sound like much to people in the USA, but it is 2.2% of the average Indian person's annual income. The same 2.2% would be an outlay of around $1,100 for the average American worker.

Perspective is everything when you try to compare the consumer market between countries like the USA and India.
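The proportion works out as stated, using the comment's own round figures:

```python
# Check the comment's proportion using its round figures.
india_income, us_income, phone = 1500.0, 50000.0, 33.0

share = phone / india_income        # fraction of an average Indian income
equivalent_us = share * us_income   # same share of an average US income

print(round(share * 100, 1))   # 2.2 (percent)
print(round(equivalent_us))    # 1100 (dollars)
```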

Re:VS a top of the line 3 year old phone...

By Em Adespoton • Score: 4, Insightful • Thread

Try to sell the Galaxy Nexus across India in volume and maybe you can tell us the answer :)

Re:1 Billion Mobile Users?

By narcc • Score: 4, Insightful • Thread

But it's significantly less than competing smartphone alternatives in that market. The price is even lower than many feature phones (most, if you only count dual-sim models) at just Rs 2000.

It's still a big investment, sure, but you're getting a LOT more bang for your rupees.

Re:1 Billion Mobile Users?

By Charliemopps • Score: 4, Interesting • Thread

It's 1 billion mobile users, not smartphone users. When I was in Africa, everyone had a feature phone... everyone. You could buy them at kiosks for less than $10, along with phone cards to fill them with minutes. A $300 phone would be completely insane there... but a $33 phone? Yeah, they'd have to save up, but that's doable. Especially when a lot of people I ran into were using their feature phone like a desktop... running entire businesses off the things.

Re:1 Billion Mobile Users?

By jma05 • Score: 5, Informative • Thread

> Some villages only have one cell phone that everyone shares

You don't seem to be talking from experience; you seem to be simply conjecturing. I am in India. I have never heard of any village sharing just one cell phone. It is not even plausible. Now, it used to be, several decades ago, that there were just a handful of landlines per village. But a cell tower will not be set up unless the provider is sure there is enough demand to make an economic case. And there always is. Mobile phones are not expensive (though not cheaper than the cheap options in the US). Mobile plans, however, are incredibly cheap compared to the US. I know poor families in India ($13 rent for a family of 4) who have multiple mobile phones, one per working adult.

> So think of it as each person in India putting out $1100 for their phone

Poor people are not buying smartphones yet (it's the lower middle class and up that is driving smartphone sales now). They still buy Nokia dumb phones and are now beginning to shift to cheap Android phones at $100. The Firefox phone helps by further lowering that barrier to entry. The minimum monthly talk refill plan I know of is 30 *cents*... very cheap. You may not get many outgoing minutes, but you don't get charged for incoming calls, unlike in the US. So everyone in India who needs one can afford a mobile phone plan.

$1100 for a phone is very expensive in India. I know several people who have them, but they are all rich. And it is often a status symbol rather than for an actual need.

> which they use in lieu of land line, TV and computer

No one in India uses a smart phone in lieu of a TV. Having cable TV (60-80 channels) in India is very cheap ($3 per month in poor neighborhoods). Indian mobile data plans start very cheap ($2) but are not robust enough to be used for routine video consumption yet. They won't be replacing TV anytime soon. Anyone who owns a $1100 mobile phone already has a pricey HDTV.

Mobile phones are also not replacing computers yet, since most phone users here, unlike in the US, were not computer users to begin with. People use cheap service stations nearby to pay bills online: the operator sits in front of a connected PC, accepts cash, and pays bills for a few cents in service charges. For now, this is much simpler for most people than using data plans and mobile web apps. Around here (a small town), there is such a tiny store for every neighborhood, and they provide small jobs serving a populace that is not yet computer-savvy.

New NRC Rule Supports Indefinite Storage of Nuclear Waste

Posted by samzenpus in YRO • View
mdsolar writes in with news about an NRC rule on how long nuclear waste can be stored on-site after a reactor has shut down. The five-member board that oversees the Nuclear Regulatory Commission on Tuesday voted to end a two-year moratorium on issuing new power plant licenses. The moratorium was in response to a June 2012 decision issued by the U.S. Court of Appeals for the District of Columbia that ordered the NRC to consider the possibility that the federal government may never take possession of the nearly 70,000 metric tons of spent nuclear fuel stored at power plant sites scattered around the country. In addition to lifting the moratorium, the five-member board also approved guidance replacing the Waste Confidence Rule. "The previous Waste Confidence Rule determined that spent fuel could be safely stored on site for at least 60 years after a plant permanently ceased operations," said Neil Sheehan, spokesman for the NRC. In the new standard, the Continued Storage of Spent Nuclear Fuel Rule, NRC staff members reassessed three timeframes for the storage of spent fuel — 60 years, 100 years, and indefinitely.

What else can they do?

By bobbied • Score: 5, Interesting • Thread

Yucca mountain is a no go for political reasons, not scientific ones, so what else can we do?

The really sad thing is that there is still a lot of usable fuel in all that, if we were allowed to reprocess it. Not to mention that reprocessing would greatly reduce the volume of the high-level waste. Carter really messed up with that decision...

So, for now, it's store in place and guard the stuff. But this is only really a problem until it cools enough to no longer require being under water. After that, guarding it isn't that hard or expensive. It can be packaged in such a way that getting into it would take hours and industrial equipment. Guarding it just means walking by every day or so and making sure nobody is messing with the containers.


By Ralph Wiggam • Score: 5, Insightful • Thread

Let's blame the people responsible- Nevada voters. The politicians are just representing their constituents. I supported the Yucca Mountain project before I moved to Nevada and I would be an asshole to change my opinion afterward.

The proposed site is over 100 miles from Vegas in the absolute middle of nowhere. Even if they stored the waste in a big open pit above ground, it still wouldn't affect anyone.

But people here are terrified about transporting the waste along the rail lines through town. There is a freight train that goes literally 100 feet from my office every day with tanker cars full of ammonia and sodium hydroxide. Nobody bats an eye.

Re:central storage or n^x security guard costs / s

By crioca • Score: 4, Insightful • Thread

It's about damned time we started building new nukes

I've been a proponent of nuclear power for years, but given how fast the cost of solar power has been falling, I think the time for investing heavily in nuclear power has passed.

On site transmutation

By mdsolar • Score: 3 • Thread
A portable accelerator could transmute the waste at each reactor site. The places are already well connected to the grid so bringing power to transmute the waste to stable isotopes would not be a problem. Just think of nuclear power as something that must be repaid.

Re:central storage or n^x security guard costs / s

By brambus • Score: 5, Informative • Thread
I'm quite aware of how the radiotoxicity of spent nuclear fuel works. There are in fact graphs detailing it. Fast reactors and actinide burners prevent the actinides from entering the waste stream in the first place, which is why their waste falls below the radiotoxicity of the original uranium ore after a few hundred years. After that, you can essentially throw the stuff back into the pit you got it out of, knowing that you've actually lowered the overall radiotoxicity of the original material. For current LWRs on a once-through cycle, this doesn't occur until some hundreds of thousands of years in the future.

CenturyLink: Comcast Is Trying To Prevent Competition In Its Territories

Posted by Soulskill in News • View
mpicpp sends word that CenturyLink has accused Comcast of restricting competition in the development of internet infrastructure. CenturyLink asked the FCC to block the acquisition of Time Warner Cable to prevent Comcast from further abusing its size and power. For example, Comcast is urging local authorities to deny CenturyLink permission to build out new infrastructure if they can't reach all of a city's residents during the initial buildout. Of course, a full buildout into a brand new market is much more expensive than installing connections a bit at a time. Comcast argues that CenturyLink shouldn't be able to cherry-pick the wealthy neighborhoods and avoid the poor ones. CenturyLink points out that no other ISP complains about this, and says allowing the merger would let Comcast extend these tactics to regions currently operated by Time Warner Cable.

Re:what's wrong with cherry picking?

By msauve • Score: 5, Interesting • Thread
The other solution is to allow partial buildouts, but ensure each phase is balanced between "rich" and "poor" areas. That lowers the cost of entry while ensuring fair competition.

There is no competition...

By Kenja • Score: 3 • Thread
The only other options are satellite and DSL... they successfully prevented competition when they threw their "we no share our cables" fit awhile back.

Re:what's wrong with cherry picking?

By ewieling • Score: 5, Insightful • Thread
If it took 10 years for Comcast to provide internet service to the *entire* city, then CenturyLink should have 10 years to do the same. Seems fair to me.

Re:what's wrong with cherry picking?

By Bradmont • Score: 4, Insightful • Thread
If there is research to do regarding what service to choose, how does comcast have a monopoly?

Re:what's wrong with cherry picking?

By Nyder • Score: 4, Informative • Thread

Okay, after I made this post, I did some research, and we apparently have other DSL providers. It's possible that you can get them in the same areas as CenturyLink; I'm not going to bother asking them if I can get their service.

So maybe my post is wrong. Sorry.

Underground Experiment Confirms Fusion Powers the Sun

Posted by Soulskill in Science • View
sciencehabit writes: Scientists have long believed that the power of the sun comes largely from the fusion of protons into helium, but now they can finally prove it (abstract). An international team of researchers using a detector buried deep below the mountains of central Italy has detected neutrinos—ghostly particles that interact only very reluctantly with matter—streaming from the heart of the sun. Other solar neutrinos have been detected before, but these particular ones come from the key proton-proton fusion reaction that is the first part of a chain of reactions that provides 99% of the sun's power.

Re:Thought that was obvious... ?

By Anonymous Coward • Score: 5, Funny • Thread

It caught me by surprise as well. But thinking about it more, it's mind-blowing that there are a lot of things we take as fact when they may just be assertions. Like fusion powering the sun, for example. I call this the Wikipedia phallacy.

I think (hope) you mean "fallacy"...

No, he has it right. By analogy with "democracy", "oligarchy", "anarchy", and so forth, Wikipedia is naturally a "phallacy", since it's quite well established that many of the editors who run the place are pricks.

Underground Eureka!

By Tablizer • Score: 4, Funny • Thread

Making huge discoveries about the universe without leaving mom's basement? Nerdgasm!

Re:That's not how science works

By khallow • Score: 5, Insightful • Thread
Or we could just realize that "proof" in empirical science means something different than it does in pure mathematics.

Re:Thought that was obvious... ?

By lgw • Score: 5, Interesting • Thread

Another surprising fact about fusion in the Sun is that the fusion power generated is about 1.5 watts per ton of core. Even in conditions in the core of the sun, fusion is hard, and the particular reaction process just confirmed was at the end of a long chain of reasoning explaining what we do see. So I think this actually gives evidence that a bunch of stuff in Wikipedia about processes in the Sun is also true. (If a different fusion process had been found, then we'd likely be wrong about how much power is generated, and thus about the rate and manner in which that power eventually makes it to the surface and gets radiated.)
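The watts-per-ton figure can be sanity-checked from round numbers. The answer depends on how much of the Sun's mass you count as "core", so this sketch brackets it; the luminosity, solar mass, and core-mass fractions below are assumed round values, not precise solar-model outputs:

```python
# Back-of-envelope check of "about 1.5 W per ton of core".
# Assumed round numbers: solar luminosity ~3.8e26 W, solar mass ~2e30 kg,
# and a fusing core holding somewhere between ~10% and ~35% of that mass.
L_SUN = 3.8e26            # W
M_SUN_TONS = 2e30 / 1e3   # metric tons

low = L_SUN / (0.35 * M_SUN_TONS)   # ~0.54 W/ton if the core is large
high = L_SUN / (0.10 * M_SUN_TONS)  # ~1.9 W/ton if the core is small
print(low, high)  # roughly 0.5 to 2 W/ton, consistent with the comment
```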

Re:Thought that was obvious... ?

By mark_osmd • Score: 4, Interesting • Thread
Another surprising fact: the Sun's core is so dense (150 g/cc) that a metric ton of it occupies only a cube about 19 cm per side.
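That figure follows directly from the stated density:

```python
# Verify the cube-size claim: at 150 g/cc, how big a cube holds one metric ton?
density = 150.0          # g per cubic centimeter
ton_grams = 1_000_000.0  # one metric ton in grams

volume_cc = ton_grams / density       # ~6667 cc
side_cm = volume_cc ** (1.0 / 3.0)    # cube root of the volume

print(round(side_cm, 1))  # 18.8
```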

Ask Slashdot: What To Do About Repeated Internet Overbilling?

Posted by timothy in Ask Slashdot • View
An anonymous reader writes "AT&T has been overbilling my account based on overcounting DSL internet usage (they charge in 50-gigabyte units after the first 150). I have been using a Buffalo NFinity Airstation as a managed switch to count all traffic. As you may know, this device runs firmware based on dd-wrt and has hidden telnet functionality, so I am able to load a script to count traffic directly onto the device. I have an auto-scraper that collects the data and saves it on my computer's hard disk every two minutes while the computer is running. While it is not running, the 2-minute counters accumulate in RAM on the device. Power problems are not normally an issue here, and even when they are, I can tell it has happened. The upshot of all this is that I can reliably measure the exact amount of download bandwidth, and a guaranteed overestimate of upload bandwidth, in bytes. I have tested this by transferring known amounts of data and can account for every byte counted, including Ethernet frame headers. AT&T's billing report shows usage by day only, lags two days, and uses some time basis other than midnight. In my testing, it also reads a fairly consistent 14% high whenever the time basis doesn't disturb the test by too much bandwidth being used too close to midnight.

AT&T has already refused to attempt to fix the billing meter, and asserts they have tested it and found it correct. Yet they refuse to provide a realtime readout of the counter that would make independent testing trivial. I've been through the agencies (CPUC, FCC, and Weights & Measures) and can't find one that is interested; AT&T will not provide any means for reasonable independent testing of the meter. It is my understanding that if there is a meter and its calibration cannot be checked, there is a violation of the law, yet I can't find an agency that will even accept such a claim (I'm not getting "your claim is meritless", but "we don't handle that"). If indeed they are not overbilling, my claim that there is no way to verify the meter still stands. My options are running thin here. So that my account can be identified by someone who recognizes the case: 7a6c74964fafd56c61e06abf6c820845cbcd4fc0 (bit commitment).
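The counting approach described above boils down to summing deltas between periodic counter snapshots, with care for the 32-bit wraparound common on consumer routers. A minimal Python sketch (illustrative only; the submitter's actual script runs on the Airstation itself):

```python
# Accumulate byte deltas from periodic raw counter readings (e.g. scraped
# from a router's interface statistics), handling 32-bit counter wrap.
WRAP = 2**32

def add_sample(total, prev_counter, new_counter):
    """Return (new_total, new_counter) given the previous raw counter."""
    delta = new_counter - prev_counter
    if delta < 0:        # raw counter wrapped around 2^32
        delta += WRAP
    return total + delta, new_counter

total, prev = 0, 0
for raw in [1_000, 5_000, 200]:   # third sample arrives after a wrap
    total, prev = add_sample(total, prev, raw)
print(total)  # 5_000 + (2**32 - 5_000 + 200) = 4294967496
```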


By TapeCutter • Score: 5, Insightful • Thread
I think people are missing the point of TFA: why are the "weights and measures" people not interested? If it were a greengrocer with a rigged scale, he would be in handcuffs explaining himself to a judge.


By DrYak • Score: 5, Insightful • Thread

Imagine if you went to buy milk and bought a gallon but were charged for 1.25 gallons because of spillage in the bottling plant.

Or to be more similar: you got charged for 1.25 because they determine the price by weighing it, and thus are also weighing the glass milk bottles and the hard plastic crate carrying them.
And when you ask them why the gallons you measure in your kitchen don't match the gallons on their bill, they just answer, "No, everything is okay, our bill is 100% right," without ever mentioning that you need to take that overhead into account, and without giving you any way to check or audit the bottle-and-crate weighing process either.

Re:What are you downloading?

By PopeRatzo • Score: 5, Funny • Thread

I quite likely have, but I am not so crass as to go around asking random strangers in the gym what their orientation is.

This is why I love Slashdot. A discussion of internet overcharging and ATM encapsulation quickly pivots to the etiquette of showering with gay men at the gym.

Honestly, I love each and every one of you. In a purely platonic way, of course, though given enough vodka and grapefruit juice, who knows?

Re:DSL payload + ATM = 16%

By TapeCutter • Score: 5, Informative • Thread

My 2x4 lumber is actually 3.5" wide.

Only if it has already been dried and dressed. It comes off the greenchain at the sawmill as 2x4 (to within 1/16 of an inch); as it dries the dimensions change, and dressing the timber takes a 1/8 of an inch off each side. If a lumber yard attempted to sell you undressed timber as 2x4 that was actually 3.75 x 1.75, then the weights and measures people would definitely be interested. Here in Oz, dressed timber is now advertised with its real dimensions, not its undressed dimensions. The practice goes way back to the days when most buildings used undressed timber for structural purposes. These days carpenters don't normally build frames on site; it's all prefab frames and roofs that just bolt together, and for that technique to work it needs the more consistent dimensions of dressed timber.

Nobody is scamming you out of useful timber; the industry terminology is well defined and is not hidden from the customer. The point of TFA is that the ISP's network metering methods are hidden from customer scrutiny and nobody at weights and measures seems to give a damn.
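The "cell tax" in the thread title can be roughly reproduced: ATM carries 48 payload bytes per 53-byte cell, and AAL5 plus PPPoE/LLC encapsulation add per-packet bytes on top. A back-of-envelope Python model (the 10-byte encapsulation figure is an assumption; actual framing varies by provider, so this is not AT&T's actual accounting):

```python
import math

# Bytes on the ATM wire for one IP packet: pad the packet plus encapsulation
# headers and the 8-byte AAL5 trailer into whole 53-byte cells (48 payload each).
def atm_wire_bytes(ip_bytes, encap_overhead=10, aal5_trailer=8):
    payload = ip_bytes + encap_overhead + aal5_trailer
    cells = math.ceil(payload / 48)
    return cells * 53

pkt = 1500                      # full-size IP packet
wire = atm_wire_bytes(pkt)      # 32 cells -> 1696 bytes
print(round((wire / pkt - 1) * 100, 1))  # 13.1 percent overhead
```

Small packets fare worse (a 64-byte packet needs two whole cells), so a real traffic mix lands in the mid-teens, in the same ballpark as the 14-16% figures discussed here.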


By jrumney • Score: 4, Insightful • Thread
They are not interested because they do not understand what is happening. They ask the industry experts they have access to, and those experts will all give the telco's side of the story. It needs someone to sit down with them over a beer and explain it in terms they can understand.

Slashdot Talks WIth IBM Power Systems GM Doug Balog (Video)

Posted by Roblimo in Hardware • View
Yesterday we had a story titled 'IBM Gearing Up Mega Power 8 Servers For October Launch,' in which Timothy promised a video interview with Balog on how he's helping spend the billion dollars that IBM pledged last year on open source development. This is that video, and in it Balog tells us how much IBM loves Linux and open source, and how they're partnering with multiple distros, most recently Ubuntu. So get ready for Power 8 servers in October. IBM is pushing them like mad -- especially in the Linux/FOSS realm. (Alternate Video Link)

Refreshingly 'normal' interview

By schweini • Score: 4, Insightful • Thread
I just wanted to say that I was pleasantly surprised that this guy seems relatively buzzword free and seems to know his stuff. Obviously he has his corporate agenda, but I would really like more higher-ups in big companies to do interviews like this.

Eye Problems From Space Affect At Least 21 NASA Astronauts

Posted by Soulskill in Science • View
SternisheFan sends this report from Universe Today: How does microgravity affect your health? One of the chief concerns of NASA astronauts these days is changes to eyesight. Some people come back from long-duration stays in space with what appear to be permanent changes, such as requiring glasses when previously they did not. And the numbers are interesting. A few months after NASA [said] 20% of astronauts may face this problem, a new study points out that 21 U.S. astronauts who have flown long flights (which tend to be five to six months) on the International Space Station face visual problems. These include "hyperopic shift, scotoma and choroidal folds to cotton wool spots, optic nerve sheath distension, globe flattening and edema of the optic nerve," states the University of Houston, which is collaborating with NASA on a long-term study of astronauts while they're in orbit.


What can be done about this?

By Type44Q • Score: 5, Insightful • Thread

Obviously, orbital habitats either need to be spun up or need to contain living quarters located within centrifuges.

Re:What can be done about this?

By ArcadeMan • Score: 4, Funny • Thread

Wouldn't that be "problem 33% solved?"

Re:yeah, i'm not interesting in going to space

By SternisheFan • Score: 5, Informative • Thread
From CBCnews, Mar 13, 2012:

Astronauts have complained for decades about vision problems such as blurriness following trips into space. A recent NASA survey of 300 astronauts found correctible near and distance vision problems in 48 per cent of astronauts who had been on extended missions and 23 per cent of those who had been on brief missions. In some cases, they lasted for years after the astronauts returned to Earth.

Fluid shifting toward head causes problems

In the new study, the astronauts had spent an average of 108 days in space. Their eye abnormalities were similar to those seen in patients on Earth with idiopathic intracranial hypertension. Patients with the condition have increased pressure around their brains for no apparent reason.

Among the astronauts in the study:

33 per cent had expansion of the space filled with cerebral spinal fluid that surrounds the optic nerve, which connects the eye to the brain.

22 per cent had flattening of the rear of the eyeball.

15 per cent had bulging of the optic nerve.

11 per cent had changes in the pituitary gland and its connection to the brain.

An earlier NASA-sponsored study of seven astronauts, published last November in the journal Ophthalmology, found similar abnormalities and also noted that they were similar to those experienced by patients on Earth suffering from pressure in the head. But it noted that astronauts did not experience symptoms usually associated with that problem on Earth, such as chronic headache, double vision or ringing in the ears.

The earlier study suggested that the problems might be caused by fluid shifting toward the head during extended periods of time in microgravity. This could result in abnormal flow of spinal fluid around the optic nerve, changes in blood flow in the vessels at the back of the eye, or chronic low pressure within the eye, the researchers said.

NASA needs to get its act together

By TomRC • Score: 3 • Thread

We've long known what will likely avoid these sorts of problems - create a rotating environment to simulate gravity.
While the physics principle is simple, engineering a safe rotating station is probably quite challenging.
The sort of thing NASA was created to investigate...
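The physics the parent calls simple is the standard circular-motion relation: centripetal acceleration equals the spin rate squared times the radius. The engineering difficulty follows directly, since a comfortable spin rate forces a large structure. A quick illustrative calculation (the radii and the 2 rpm comfort figure are commonly cited assumptions, not from the article):

```python
import math

# Solving a = omega^2 * r for the spin rate that gives 1 g at radius r
# shows why a rotating station has to be big: small radii need fast
# spins, which are thought to cause disorientation.
g = 9.81  # m/s^2
for r in (10, 100, 225):   # 225 m gives roughly 2 rpm, an often-cited comfort limit
    omega = math.sqrt(g / r)            # rad/s required for 1 g at radius r
    rpm = omega * 60 / (2 * math.pi)
    print(f"radius {r:4d} m -> {rpm:.1f} rpm")
```

At 10 m the station would have to spin nearly ten times a minute, which is why small capsules cannot simply be rotated.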

Zero-G is bad long term, but what about 1/6-G?

By deathcloset • Score: 3 • Thread
This is an unintuitive wild speculation, but I wonder if these effects are a linear function of the gravity or if there is a more complex interaction.
In other words, if Alice spent 6 months in zero-G and Bob spent 6 months in 0.166-G, and assuming equal eye health, would Bob have less damage than Alice or more?

Obviously the human body emerged out of a 1-G environment, so the eye has evolved with those pressures. But just because removing those pressures completely may result in harm, that is not to say that removing those pressures partially would be harmful.

The only non-zero-G astronauts I know of were the Apollo folks - but I can't find any information (or anecdotes from them) on the difference in physiological effects of zero-G versus 1/6th-G.

It seems like they would have experienced less intracranial pressure and would have had an actual reference for up and down.

Oh space be a harsh mistress.

DoT Proposes Mandating Vehicle-To-Vehicle Communications

Posted by Soulskill in YRO • View
schwit1 sends word that the Dept. of Transportation's National Highway Traffic Safety Administration has given notice of a proposal (PDF) for a new car safety standard that would require vehicle-to-vehicle communication equipment in all new passenger cars and light trucks. The NHTSA thinks this will facilitate the development of new safety software for vehicles. They estimate it could prevent over 500,000 crashes (PDF) each year. "Some crash warning V2V applications, like Intersection Movement Assist and Left Turn Assist, rely on V2V-based messages to obtain information to detect and then warn drivers of possible safety risks in situations where other technologies have less capability. ... NHTSA believes that V2V capability will not develop absent regulation, because there would not be any immediate safety benefits for consumers who are early adopters of V2V." The submitter notes that this V2V communication would include transmission of a vehicle's location, which comes with privacy concerns.

Re:Official Vehicles

By macs4all • Score: 4, Informative • Thread

because that amounts to surveillance. The closest thing to current system would be a detector placed at certain locations and would only ticket vehicles within 50meter radius. This would be similar to traffic cameras.

...Or those mysterious PAIRS of buried "loop detectors" (complete with a SHIELD buried between them so that the "triggers" produced are crisply timed) that have appeared, along with the $50k (guessing) controller boxes hiding in the bushes off the side of the road. What do you think a PAIR of loop detectors (positioned so you drive over one, then the other, in quick succession) in the SAME LANE is for?

I'll give you a hint: They are ALWAYS positioned within eyesight of the tall "lighting" towers (you know, the ones with the pan/tilt/zoom cameras in them, that the gummint called people crazy and paranoid for saying they (the hidden cameras) were there, until they started broadcasting the signals from them on the TV news every day).

Check it out. I am an embedded developer who has some experience working with vehicle loop detectors, and I can recognize a SPEED DETECTOR when I see one: that's why there are two detectors, to develop an "interval" between the signals, and the shield is to make the "detection time" more reliable (loop detectors were originally not designed to be so precise).

They started appearing about 5 years ago on the interstate system in the state in which I live, and I have seen them in other states of the U.S.A., too. But no one EVER talks about them...

No, it's not anonymous. It's full tracking.

By Animats • Score: 5, Informative • Thread

Here's a more technical discussion from NHTSA. At page 74-75, the data elements of the Basic Safety Message I and II are listed. The BSM Part I message doesn't contain the vehicle ID, but it does contain latitude and longitude. The BSM Part II message has the vehicle's VIN. So this is explicitly not anonymous.

Back in the 1980s, when Caltrans was working on something similar, they used a random ID which was generated each time the ignition was switched on. That's all that's needed for safety purposes. This system has a totally unnecessary tracking feature.
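The Caltrans approach described above, a fresh random ID each time the ignition is switched on, is easy to sketch. Field names and the message layout here are illustrative assumptions, not from the NHTSA proposal:

```python
import secrets

class V2VSession:
    """Generates a fresh, unlinkable sender ID per ignition cycle,
    in the spirit of the 1980s Caltrans design described above."""

    def __init__(self):
        # 64 random bits: negligible collision risk within radio range,
        # and no way to link one trip to the next.
        self.sender_id = secrets.randbits(64)

    def safety_message(self, lat, lon, speed_mps):
        # Hypothetical message layout: position and speed suffice for
        # collision warnings; no VIN or other persistent identifier.
        return {
            "sender": self.sender_id,
            "lat": lat,
            "lon": lon,
            "speed_mps": speed_mps,
        }

trip1 = V2VSession()
trip2 = V2VSession()  # next ignition cycle: a brand-new identity
print(trip1.sender_id != trip2.sender_id)
```

Nearby cars can still correlate messages within a single trip (which safety applications need), but the identifier dies when the engine is switched off.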

Most of this stuff only works if all vehicles are equipped. It also relies heavily on very accurate GPS positions. However, there's no new sensing - no vehicle radar or LIDAR. The head of Google's autonomous car program is on record as being against V2V systems, because they don't provide reliable data for automatic driving and have the wrong sensors.

If something is going to be required, it should be "smart cruise" anti-collision radar. That's already on many high-end cars and has a good track record. It's really good at eliminating rear-end collisions, and starts braking earlier in other situations such as a car coming out of a cross street. Mercedes did a study once that showed that about half of all collisions are eliminated if braking starts 500ms earlier.
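A rough sense of what a 500 ms head start buys, using simple kinematics (these figures are illustrative, not taken from the Mercedes study):

```python
# Distance covered before braking even begins, for a 500 ms head start
# at typical road speeds. Pure kinematics: distance = speed * time.
for kmh in (50, 100, 130):
    v = kmh / 3.6          # convert km/h to m/s
    d = v * 0.5            # metres travelled in 500 ms
    print(f"{kmh} km/h: {d:.1f} m of extra stopping margin")
```

At highway speed that is roughly a dozen metres of extra margin, which is often the difference between a near miss and a rear-end collision.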

V2V communications should be an extension of vehicle radar. It's possible to send data from one radar to another. Identify-Friend-Foe systems do that, as does TCAS for aircraft. The useful data would be something like "Vehicle N to vehicle M. I see you at range 120m, closing rate 5m/sec, bearing 110 relative. No collision predicted". A reply would be "Vehicle M to vehicle N. I see you at range 120m, closing rate 5m/sec, bearing 205 relative. No collision predicted". That sort of info doesn't involve tracking; it's just what's needed to know what the other cars are doing. It's also independent of GPS. Useful additional info would be "This vehicle is a bus/delivery truck, is stopped, and will probably be moving in 5 seconds.", telling you that the big vehicle ahead is about to move and you don't need to change lanes to go around it.
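The GPS-free exchange sketched above can be reduced to a small amount of logic: each vehicle reports only what its own radar measures of the other, and a collision is flagged when the time to impact at the current closing rate drops below a warning threshold. All names and the 4-second threshold below are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class RadarReport:
    """What one vehicle's radar sees of another: no position, no ID."""
    range_m: float        # measured distance to the other vehicle
    closing_mps: float    # positive means the vehicles are getting closer
    bearing_deg: float    # bearing relative to own heading

def assess(report: RadarReport, warn_seconds: float = 4.0) -> str:
    # Time to impact if the closing rate holds; independent of GPS.
    if report.closing_mps <= 0:
        return "No collision predicted"
    tti = report.range_m / report.closing_mps
    return "Collision predicted" if tti < warn_seconds else "No collision predicted"

# The example from the comment: 120 m apart, closing at 5 m/s (24 s out).
print(assess(RadarReport(range_m=120, closing_mps=5, bearing_deg=110)))
```

Because each side only ever states relative measurements, nothing in the exchange identifies or tracks either vehicle.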

"Braking Hard Alert"

By Macdude • Score: 3 • Thread

How about we just implement a system where, when a vehicle brakes hard, it also sends out a low-power directional signal (to the rear) that reads "Hard Braking, #1 vehicle, [random number]".

Then every vehicle that receives it rebroadcasts "Hard Braking, #2 vehicle, [random number]", every vehicle that receives that rebroadcasts "Hard Braking, #3 vehicle, [random number]", etc. Then at some predetermined cutoff point (a number dependent on the vehicle's speed) the vehicles stop propagating the message.

The point of the random number is so that your vehicle can ignore multiple receipts of the same braking event while not identifying the vehicle.

That should cover the vast majority of situations that you want your vehicle to warn you about.
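The rebroadcast scheme described above fits in a few lines: a random event tag for anonymous deduplication, an incrementing hop count, and a speed-dependent cutoff. The message format and the cutoff rule below are guesses for illustration, not part of the comment's proposal:

```python
import random

def hop_limit(speed_mps: float) -> int:
    # Faster traffic warrants warning further back (assumed rule of thumb).
    return max(3, int(speed_mps // 5))

class Car:
    def __init__(self, speed_mps: float):
        self.speed = speed_mps
        self.seen = set()     # random event tags already handled

    def brake_hard(self):
        event_id = random.getrandbits(32)   # anonymous per-event tag
        return {"event": event_id, "hop": 1}

    def receive(self, msg):
        # Ignore duplicate receipts of the same braking event.
        if msg["event"] in self.seen:
            return None
        self.seen.add(msg["event"])
        if msg["hop"] >= hop_limit(self.speed):
            return None                     # cutoff reached: stop relaying
        return {"event": msg["event"], "hop": msg["hop"] + 1}

# A line of cars at highway speed relaying one braking event rearward.
cars = [Car(30.0) for _ in range(10)]
msg = cars[0].brake_hard()
relayed = 0
for car in cars[1:]:
    msg = car.receive(msg)
    if msg is None:
        break
    relayed += 1
print(f"message relayed by {relayed} cars")
```

Nothing in the message identifies a vehicle; the random tag exists only so a given car reacts once per braking event.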

Re:Oh, really?

By Quirkz • Score: 4, Funny • Thread

His real crime was ending his sentence with a preposition.

Hey, look, you just ended a sentence with "a preposition."

the purpose is tracking cars

By Catbeller • Score: 3 • Thread

Forget the happy horseshit about super-safe robot cars. We don't have those, and they won't work when we do. This is about the ability to track all the vehicles in the world, either by private entities who will backdoor the info to government and political groups, or straight-up security force tracking. Not just here, but all over the world. We are building turnkey police state infrastructure. If you can't grasp this, you might want to contemplate how privileged you are not to ever feel endangered by cops or political opponents like Scientology or the Moonies. Do not give the monkeys the key to the banana plantation. Once you are in a worldwide prison, there is no escape.

How Red Hat Can Recapture Developer Interest

Posted by Soulskill in Developers • View
snydeq writes: Developers are embracing a range of open source technologies, writes Matt Asay, virtually none of which are supported or sold by Red Hat, the purported open source leader. "Ask a CIO her choice to run mission-critical workloads, and her answer is a near immediate 'Red Hat.' Ask her developers what they prefer, however, and it's Ubuntu. Outside the operating system, according to AngelList data compiled by Leo Polovets, these developers go with MySQL, MongoDB, or PostgreSQL for their database; Chef or Puppet for configuration; and ElasticSearch or Solr for search. None of this technology is developed by Red Hat. Yet all of this technology is what the next generation of developers is using to build modern applications. Given that developers are the new kingmakers, Red Hat needs to get out in front of the developer freight train if it wants to remain relevant for the next 20 years, much less the next two."

If you can install it, who cares?

By msobkow • Score: 3 • Thread

If you install the newer packages you want, who cares what the "default" package is?

Personally, I'd much rather have a distro that lets me choose which version of packages to install than one that shoves a version down my throat randomly during system updates.

Granted, the Debian stable I run isn't full of the latest shiny, shiny, but it isn't causing update problems by rolling out new versions of packages, either. Both Debian stable and Red Hat's RHEL are focused on stability, not bleeding-edge development. No one in their right mind runs production systems on untested versions of packages, and no one (not even banks) can afford to do constant regression testing on the latest releases of software just because it's "new."

I'm constantly surprised at how many people opt for downloading the "production" version of my own project, even though that really was just a peg in the dirt of functionality, not some big fancy schmancy roll-out that went through more testing than other releases. There are bug fixes and new features in the latest and greatest, but a lot of people don't want that -- they want that peg in the dirt, and are content to wait for an SP1 to get access to the new features and bug fixes.

Don't forget it can often take a few months to properly regression test software. It isn't just an issue of booting with the latest version and making sure it starts running -- it's testing how it responds to having network cables yanked, power flipped off hard, sometimes even yanking hardware components while a box is running. Serious servers aren't something you just push out after running them with a dozen users for a week.

Re:IT departments, on the other hand...

By x_t0ken_407 • Score: 4, Insightful • Thread

I'd also prefer RHEL over all others, except for the costs (which are inconsequential to me, usually). I too get stuck trying to push CentOS...most shops I've been at want to be able to point the finger at someone, hence paying for RHEL. What's odd is, the shop I've been working at for the past year actually uses Ubuntu LTS, so I've (unfortunately, or perhaps, fortunately, in the name of expanding my knowledge) had to learn the system pretty quickly. Haven't had any problems with it so far, and actually I'm impressed with the LTS version's stability (while originally I abhorred it for no actual reason, heh). Seems like a cross between RHEL's stability and Fedora's up-to-date packages.

Re:Mission Critical ... Red Hat... LOL..

By bill_mcgonigle • Score: 4, Interesting • Thread

The whole point was that developers influence the choice of distro on the server

There must be cases where this is true. However, it's really unclear to me why most developers would care and why they would feel themselves qualified if they have competent sysadmins to work with.

When I've got my sysadmin hat on, most of the developers I work with are developing on Macs. They have no hangups about their code being deployed on EL systems in a big data center. Nobody is clamoring for a shelf full of MacPro tubes to deploy on.

When I've got my developer hat on, I usually write on a Fedora machine. But I'm not daft enough to try to run Fedora on a server and have to worry about the maintenance cycle. I put my configs in a puppet module that pushes the code out to whichever VM I'm going to run it on, regardless of the OS, hypervisor, hardware, or country that code is bound for.

If my code doesn't run on a particular distro, then my code is probably broken (or my devops is hosed).

Maybe there are some startups with a bunch of kids and one third-career CEO and they all tell him what's going to happen. Good for them, I guess. Someday a sysadmin might come in and help them fix their stack. Let's not speak of the failwhale.

Preaching to the choir

By laffer1 • Score: 3 • Thread

I work at a large university. IT gave us two options for operating systems on our servers, Redhat or Windows. They also offer a DIY vmware setup. Rather than having IT manage our servers, I have to do it just so we can run Ubuntu. It is impossible to run certain packages like OpenCPU on Redhat because no one ever bothered to port it. Before you jump to the conclusion that linux is linux, it's really not. You can blame Ubuntu for going off the beaten path or Redhat for not keeping up with the times but some software packages only run on one linux distro without considerable effort. Conversely, the only supported backup solution for our servers is IBM tivoli crap and I went through hell to convert the rpm based installer into something that would work on Ubuntu LTS. IBM doesn't get that Ubuntu (or debian derived) distros are popular now either.

As a *BSD guy, I find both Ubuntu and Redhat irritating but at least ubuntu has apt-get. Funny thing is I started on Redhat 5.0 in '99 or so as my first *nix like os. Back then they had a desktop that didn't suck though.

Trendy != Better

By Etcetera • Score: 5, Insightful • Thread

Given that developers are the new kingmakers, Red Hat needs to get out in front of the developer freight train if it wants to remain relevant for the next 20 years, much less the next two.

It's very hard to avoid a snarky response, but I'll try.

* Developers are not kingmakers
* Developers are not system administrators
* Developers don't understand operations
* Developers often don't understand scale engineering unless they can abstract it away by not thinking too hard about anything
* Red Hat Enterprise Linux (and its derivatives) are not intended to be shiny new, but to be reliable
* Use Fedora if you want bleeding edge, or re-package things yourself. RPMs aren't hard.