Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Topics - W1nTry

Pages: 1 2 [3] 4 5 ... 53
41
Well even if you haven't, there's always someone out there with nothing to do and funding. So lo and behold, some guys thought to thoroughly investigate what exactly keeps a bike upright in spite of the many designs, shapes and sizes, and their answer isn't as clear-cut as we've believed it to be for well over 100 years...

Quote
Moving bikes stay upright—but not for the reasons we thought
By John Timmer | Last updated about 22 hours ago

The phrase "just like riding a bike" is used to refer to something that, once learned, you never forget how to do. As it turns out, bikes make that easy on us. If a typical bicycle is moving forward fast enough, it tends to remain upright and steer in a straight line, even if the rider takes his or her hands off the handlebars. In fact, you can set a bicycle rolling without a rider at all, and it tends to remain upright and roll in a straight line.

Attempts to understand this stability have been around almost as long as bicycles have existed, and most people have accepted explanations focused on gyroscopic forces and the location of the steer axis. But a team of engineers has now built a bicycle that eliminates both of these features, yet still manages to stay upright.

The authors of the new paper do an amazing review of how the popular explanations for a bicycle's stability got so, well, popular—their first reference dates back to 1869, and 10 of the 19 are over 100 years old. In one case, they spot a mathematical error (a reversed sign) in a 1910 reference.

Most people have seen a gyroscope in action, so the stability of a rapidly rotating wheel should be fairly intuitive, making this a focus from the start. People have built bicycles with counter-rotating wheels and found that they still remain upright, so that can't be all of the story. Another focus has been on the fact that the area where the front wheel touches the ground is a bit behind the axis of steering, which also seems to add stability to traditional bicycle designs.

To test the relative contributions of these factors, the authors eventually built their own computer model of a bicycle and started playing around with various features. It turned out that they could eliminate both the gyroscopic and the negative trail factors, and the bike would still be stable as long as it was moving faster than 2.3 meters (7.5 feet) per second. They could even move steering to the rear wheel and produce a stable design.

The apparently unreasonable stability of different bicycle designs suggested that their model might have lost touch with reality, so the authors went out and built a bike with a counter-rotating wheel to cancel gyroscopic effects, as well as a negligible (4mm) trail between the front wheel and the steering axis. As their model predicted, it tended to stay upright, and would steer into any falls that their grad students tried to induce.

What their math can't apparently tell them is why so many different bike designs tend to stay upright. "Why does this bicycle steer the proper amounts at the proper times to assure self-stability?" they muse. "We have found no simple physical explanation equivalent to the mathematical statement that all eigenvalues must have negative real parts." In other words, they can see why the math works out the way it does, but can't figure out what physical properties correspond to that.

The best they can surmise is that the stability is related to the ability of the bike to steer into a fall if it starts to lean, and that there are multiple ways of constructing a bike that does this.

So in short, our bikes are held upright by dark matter interacting with gravity waves whilst catering for Heisenberg's uncertainty principle...

http://arstechnica.com/science/news/2011/04/moving-bikes-stay-uprightbut-not-for-the-reasons-we-thought.ars

No wonder you 'never' forget...
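The stability criterion the researchers quote, that all eigenvalues must have negative real parts, can be illustrated with a toy two-dimensional linear system. This is a generic sketch; the matrix entries below are made up for illustration and are not taken from the paper's bicycle model:

```python
# For a 2x2 linear system x' = A x with A = [[a, b], [c, d]], both
# eigenvalues have negative real parts exactly when trace(A) < 0 and
# det(A) > 0 (the Routh-Hurwitz criterion), meaning small perturbations
# (a lean, a wobble) decay instead of growing.
def is_self_stable_2x2(a, b, c, d):
    trace = a + d
    det = a * d - b * c
    return trace < 0 and det > 0

# Eigenvalues -1 and -3: every disturbance dies out, so it's stable.
print(is_self_stable_2x2(-1.0, 2.0, 0.0, -3.0))   # True
# One eigenvalue at +0.5: any lean grows, so the "bike" falls over.
print(is_self_stable_2x2(0.5, 1.0, 0.0, -2.0))    # False
```

For an actual bike model the state would be the lean and steer angles and their rates, but the same sign test on the linearized dynamics is what decides self-stability.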

42
PC Gaming / Spiral Knights
« on: March 21, 2011, 11:29:27 AM »
It's a free MMO from Sega and indie developer studio Three Rings. It's not out just yet, but it could prove to be a GREAT time waster for those into these kinds of games:



Quote
Sega details free-to-play MMO Spiral Knights, Ars readers can play early
By Andrew Webster | Last updated 2 days ago

Sega has teamed up with indie studio Three Rings—a development team that works out of a steampunk submarine and has worked on browser-based games like Puzzle Pirates—to create Spiral Knights, an upcoming free-to-play MMO set in a sci-fi-meets-fantasy world. And Ars readers can get in on the action early.

While it's an MMO, Spiral Knights also has a large focus on co-op, action RPG-style play. You can team up with up to four friends, and run around the constantly changing 3D world bashing monsters. And, of course, there are all the genre mainstays like character customization, weapon crafting, and guilds. Take a look at the game in action:
Spiral Knights

Spiral Knights is expected to launch in April, and is currently in a closed beta. But if the video above has whetted your appetite, you're in luck. The first 500 Ars readers to register at this link will be able to play the game ahead of its release. The beta will be available from now until March 28 and, again, is only available to the first 500 people who register. So what are you waiting for?

http://arstechnica.com/gaming/news/2011/03/sega-details-free-to-play-mmo-spiral-knights-ars-readers-can-play-early.ars

I could see myself playing this at work with my xbl crew XD

43
Yeah, it's EXACTLY what it sounds like: cars that can drive themselves! And whilst I agree with what's said in the quote to follow, one can't help but wonder about the other end of the spectrum... The Matrix, Terminator, etc. Have a read... Trinidad could REALLY do with fully autonomous vehicles...

Quote
Google's landmark deployment of autonomous cars — vehicles that can drive themselves — onto public roads and into live human traffic last summer and early fall was announced casually, and after the fact.

Astonishingly, the search-engine giant was able to set loose seven Toyota Prius hybrids, all adorned with a dizzying array of odd-looking sensors, onto Highway 1 between San Francisco and Los Angeles for several months without raising suspicion. Each vehicle was piloted by artificial-intelligence software designed to interpret the data collected by the sensors and use it to mimic the decisions made by a human driver. The goal: to fundamentally change the way we use cars.

How so, you ask? Google believes that the use of autonomous vehicles could nearly halve the number of automobile-related deaths — which it estimates at 1.2 million worldwide per year — because computers are theoretically more precise drivers than humans. In addition, the instant reaction time and 360-degree awareness of computer-controlled vehicles would allow them to ride closer together on the highway than vehicles driven by humans, thus reducing traffic congestion. And finally, they can be more fastidious with the accelerator, reducing fuel consumption and carbon emissions considerably.

Essentially, riding in an autonomous car could shave time off your daily commute, reduce your carbon footprint, save you money and save lives in the long run. And you don't even have to lift a finger. Instead of driving, you're a passenger — working, watching television, conversing with friends. Sounds idyllic.

Does this mean that self-directed robot cars, the kind that science-fiction writers have been dreaming about for decades, will hit the streets within a couple of years? No.

While the Google project may be one of the most high-profile demonstrations of autonomous-vehicle research, and one of the most successful to date, the path to a production robotic car still remains uncertain and would require clearing a staggering number of technical and legal hurdles. But plenty of people, in both the academic world and in the research-and-development divisions of carmakers such as GM and Volvo, are working out the kinks.


Read more: http://editorial.autos.msn.com/article.aspx?cp-documentid=1179150&icid=autos_0364&GT1=22022

44
So... read, then comment:
Quote
Study shows mobiles boost brain metabolism

Probably not a good thing
By Spencer Dalziel
Wed Feb 23 2011, 12:43
A STUDY HAS REVEALED that chatting on a mobile phone for long periods can cause a boost in brain metabolism.
The research reported in the Journal of the American Medical Association (JAMA) found that exposure to mobile phones does affect brain glucose metabolism, which is a marker of brain activity.
Before the great unwashed gets too excited about the idea of boosting brain metabolism, it's not actually a good thing. It doesn't translate to increased brain power. If anything, it could expose potential dangers of using mobile technology if the results concur with other recent studies into the health risks of using mobile phones.
The research team did a randomised crossover study with 47 mobile phone users. Two mobiles were put over either ear and an injection was given to measure the amount of brain glucose metabolism. The mobile over the right ear was muted for 50 minutes but turned on and the other mobile was turned off.
"The metabolism in the region closest to the antenna (orbitofrontal cortex and temporal pole) was significantly higher for on than off conditions," wrote the researchers.
But what does that increased brain glucose metabolism translate into? The problem is that the research team really doesn't know and won't extrapolate a hypothesis from the study to suggest that there is a significant health risk in the short term or long term.
But that's science for you. We're sure the traditional press will get hold of this study and draw whatever conclusions they want. We can already imagine their headlines. µ
http://www.theinquirer.net/inquirer/news/2028324/study-mobiles-boost-brain-metabolism

So the more you use it, the more glucose your brain metabolizes... sugar rush on a phone, anyone.. :s

45
Hardware, Tweaking & Networking / Is your high end rig a high end cost?
« on: February 22, 2011, 12:55:58 PM »
We in T&T benefit from oil, gas and everything that comes with it, particularly a LARGE subsidy on fuel. We have some of the BEST power rates anywhere, and as such we don't really think too much about the POWER our rigs are consuming whilst we bask in HD gaming glory. Let's stop for a moment and really think about how much extra our rigs are adding to our light bill, and then ask: is it reasonable?

Tom's Hardware thought about it as well and here are the results of their findings:
http://www.tomshardware.co.uk/power-consumption-graphics-cards,review-32118.html

Those are some pretty alarming figures; all you high-end rigs out there, have a read.
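As a rough back-of-the-envelope sketch of what the extra draw does to the light bill: cost is just watts, hours and the tariff multiplied out. The wattage, hours and TT$/kWh rate below are assumptions for illustration, not figures from Tom's article or T&TEC's actual tariff:

```python
# Estimate the annual electricity cost of a gaming rig's extra draw.
# Assumed figures: a high-end GPU adding ~250 W under load, 3 hours of
# gaming a day, and a hypothetical rate of TT$0.25 per kWh.
def annual_cost(extra_watts, hours_per_day, rate_per_kwh):
    kwh_per_year = extra_watts / 1000.0 * hours_per_day * 365
    return kwh_per_year * rate_per_kwh

cost = annual_cost(extra_watts=250, hours_per_day=3.0, rate_per_kwh=0.25)
print(f"TT${cost:.2f} per year")  # TT$68.44 per year
```

Swap in your own card's load figures from the Tom's Hardware charts and your actual tariff to see what your rig really costs you.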

46
Wacky World of Weird News! / All your lightwaves r belong to us
« on: February 18, 2011, 03:03:55 PM »
This is straight out of science fiction... I had instant visions of Tousen's Bankai. Have a read:

Quote

Less than a year after it was first suggested, the world’s first antilaser is here. A team of physicists have built a contraption that, instead of flashing bright beams, utterly extinguishes specific wavelengths of light.

Conventional lasers create intense beams of light by stimulating atoms to spit out a coherent beam of light in which all the light waves march in lockstep. The crests of one wave match the crests of all the others, and troughs match up with troughs.

The antilaser does the reverse: Two perfect beams of laser light go in, and are completely absorbed.

“There will be nothing coming out again,” said experimental physicist Hui Cao of Yale University, whose research group built the new device.

The device could find uses in fields from computing to medical imaging, the researchers report in the Feb. 18 Science.

Yale physicist A. Douglas Stone, a coauthor of the paper, first suggested the antilaser in a theoretical paper last July. Stone and colleagues had noticed that several other researchers had hinted at the idea of a laser that runs backward, and some problems in engineering called for a way to completely snuff out light. But no one had ever put the two ideas together.

“Others discovered independently that there’s an optimal condition where they can have the best absorption,” Cao said. “But they didn’t realize this was a time-reversed laser. They didn’t know they can get in principle perfect absorption.”

To build the antilaser, which Cao and colleagues call a “coherent perfect absorber,” the researchers split a beam from a titanium-sapphire laser in two. The laser emitted light in the infrared part of the electromagnetic spectrum, with longer wavelengths than the human eye can see.

Some of the light continued forward through the beam splitter, and the rest was forced into a sharp right turn. The physicists guided the light beams into a cavity containing a silicon wafer one micrometer thick. One beam entered from the left and one from the right. The distance each beam travels determined the way the crests and troughs of the light waves aligned when they met in the wafer.

When the alignment was right, the light waves canceled each other out. The silicon absorbed the light and converted it to another form of energy, like heat or electrical current.

“It is a simple experiment,” Cao said. “But it shows a very powerful way to control absorption.”

The device can only absorb one wavelength of light at a time, but that wavelength can be adjusted by changing the thickness of the wafer.

Surprisingly, the antilaser switched from absorbent to reflective when the researchers changed the way the waves met in the wafer. Under certain conditions, the silicon crystal actually helped light escape.

“That is a little surprising,” Cao said. “We can turn it on and off.”

Theoretically, 99.999 percent of the light can be extinguished. Because of the physical limitations of the laser and the silicon wafer, the antilaser only absorbed 99.4 percent of the light.

That may be good enough, Cao says.

“For many applications, if you already have less than 1 percent coming out, you’re already okay,” she said. “I’m sure people in the community who have better lasers than us, I’m sure they will achieve much more impressive results. This is only the first demonstration of the principle.”

The device may find uses in optical switches for future super-fast computer boards that use light instead of electrons. It may also have medical applications, such as imaging a tumor through normally opaque human tissue.

The most exciting applications will no doubt be the ones no one has thought of yet. The laser itself was called “a solution without a problem” when it first showed up.

“It is quite novel and indeed surprising that in such a mature field one can come up with something fundamentally new,” said physicist Marin Soljačić of MIT, who was not involved in the new work. “I think it opens a few exciting venues.”

http://arstechnica.com/science/news/2011/02/physicists-build-worlds-first-antilaser.ars
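The cancellation being described is ordinary two-beam interference: when the crests of one wave line up with the troughs of the other, the fields sum to zero. Here is a minimal, idealized numeric sketch of how the combined intensity depends on the phase difference between the two beams; it is not a model of the actual silicon cavity:

```python
import cmath

# Superpose two coherent beams of equal amplitude. The combined
# intensity |E1 + E2|^2 ranges from 4x a single beam (in phase) down to
# zero (pi out of phase) - the antilaser tunes the path lengths so the
# waves cancel inside the absorber instead of escaping.
def combined_intensity(amplitude, phase_difference):
    e1 = amplitude
    e2 = amplitude * cmath.exp(1j * phase_difference)
    return abs(e1 + e2) ** 2

print(combined_intensity(1.0, 0.0))       # 4.0 (constructive)
print(combined_intensity(1.0, cmath.pi))  # ~0 (destructive)
```

Changing the wafer thickness shifts which wavelength satisfies the destructive condition, which is why the device can be tuned but only absorbs one wavelength at a time.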

47
Mobile Phones & Gadgets / HP Veer and WebOS
« on: February 18, 2011, 02:50:02 PM »
I am NOT a smartphone junkie, I don't know what the full features of my BB are far less for Nokia N8 and Droid phones, however for those that do, you might appreciate this:





Quote
HP Veer first impressions
Good things come in small packages
By Lawrence Latif in Barcelona
Thu Feb 17 2011, 13:57

DECEPTIVELY SMALL is the impression we were left with after playing around with HP's Veer handset.

The small WebOS device, the HP Veer, had its usefulness questioned by many, including us, but after using the diminutive smartphone there's no doubt that HP has managed to pull off a coup by bunging a good operating system into a small package. The Veer punches above its weight, both in terms of performance and usability, and for many its size will provide the ideal blend of smartphone and fashion accessory with a touch of bling.

The pebble shaped Veer is defined by its 2.6-inch touchscreen and its slide-out keyboard. The hardware keyboard is needed simply because the 2.6-inch screen is too small to type on. The keyboard isn't the biggest in the world, and compared to the HP Pre 3 the keys are packed in tightly, however it's far more usable than you would imagine from just looking at it.

While the screen itself has a relatively modest resolution by the standards of full sized smartphones, given the physical size of the screen, it offers reasonable pixel density. Unlike with HP's previous Pre and Pixi smartphones, finally HP has seen sense and put a glass covered screen on the Veer, Pre 3 and Touchpad. The difference is both seen visually and felt in use, making a better tactile impression.

Perhaps the only surprise is the thickness of the Veer. It is by no means fat, but including a hardware keyboard makes the Veer slightly chubby. The chap from HP who was giving the full sales spiel on WebOS said that people could carry the Veer around in their shirt pocket. While that is true, there is no doubt that the keyboard does add bulk to an otherwise well proportioned design.

However it isn't the hardware that makes HP's Veer shine, it's the splendid WebOS that makes the device worth considering. The unit we tested had WebOS 2.1.1 loaded and without doubt it was the best mobile operating system we encountered at Mobile World Congress, even surpassing Google's Android 3.0. It isn't the impressive visuals that make WebOS a delight, rather the intuitive way that users can navigate throughout the operating system, almost waltzing their fingers on the screen and gesture pad. It's so easy to use it's as if the user's mind and not their fingers are directing the operating system.

To gain an appreciation for the Veer it needs to be compared with the Pre 3. Not only does the Veer make the Pre 3 seem oversized and cumbersome, but the fact that HP confirmed to The INQUIRER that WebOS on the Veer is not a cut-down version makes the Veer all the more impressive. Aside from the change in screen resolution, there was no difference in applications such as the web browser or email client. We don't know whether third-party applications will maintain feature parity between WebOS devices at this stage, but the built-in applications do.

We were unable to test battery life in our relatively short test of the Veer. The device's size does mean it has a smaller battery than other smartphones, but the lower power draw of its 800MHz processor and 2.6-inch screen should compensate somewhat for that.

After HP unveiled the Veer we asked the question, just who would want such a device? After playing around with the Veer, it's still difficult to tell. Those with larger smartphones might find the small screen size daunting, however HP might gain some traction with those who want the occasional smartphone capability, such as reading email and every so often firing off a reply. The Veer isn't a 'party phone' but rather a diet smartphone, and that goes against the grain of what most phone manufacturers were showing at Mobile World Congress, which were devices that offer greater capabilities through larger screens and faster processors.

Despite our lingering doubts about the Veer's ability to sell in large numbers, HP has done a great job in creating a small, fast smartphone that has a good screen and keyboard. But above all, the Veer makes you realise just how excellent HP's WebOS really is. µ

Read more: http://www.theinquirer.net/inquirer/review/2027099/hp-veer-review#ixzz1EKz3LfP9

48
Software, Security, Programming and Internet / LibreOffice is a GO!
« on: January 25, 2011, 02:28:44 PM »
RIP Open Office, LibreOffice the successor has come.

Quote
First release of LibreOffice arrives with improvements over OOo
By Ryan Paul | Last updated about 4 hours ago

The Document Foundation (TDF) has announced the availability of LibreOffice 3.3, the first official stable release of the open source office suite. It introduces a number of noteworthy new features and there are improvements throughout the included applications. More significantly, the release reflects the growing strength of the nascent LibreOffice project.

TDF was founded last year when a key group of OpenOffice.org (OOo) contributors decided to form an independent organization to develop a community-driven fork of OOo. The move was necessitated by Oracle's failure to address the governance problems that had plagued OOo under Sun's leadership, particularly the project's controversial copyright assignment policies. Oracle's acquisition of Sun and subsequent mismanagement of Sun's open source assets have created further uncertainty about the future of OOo and the sustainability of its community under Oracle's stewardship.

TDF got off to a good start and has attracted a lot of enthusiasm from former OOo contributors; Google, Red Hat, Canonical, and Novell are among its corporate supporters. The development effort so far has been reasonably productive. Contributors have been able to enhance LibreOffice with features that Sun had resisted accepting upstream, including parts of Novell's popular Go-OOo patch set. The LibreOffice developers have also incorporated significant improvements taken from the OpenOffice.org 3.3, which hasn't yet been officially released.

The new features included in LibreOffice 3.3 improve the office suite's feature set, usability, and interoperability with other formats. For example, it has improved support for importing documents from Lotus Word Pro and Microsoft Works. Another key new feature is the ability to import SVG content and edit SVG images in LibreOffice Draw.

Navigation features in Writer have been improved, the thesaurus got an overhaul, and the dialogs for printing and managing title pages got major updates. LibreOffice Calc touts better Excel interoperability and faster Excel file importing. The maximum size of a Calc spreadsheet has increased to 1 million rows.

In addition to delivering feature improvements, the LibreOffice developers have also focused heavily on code clean-up efforts with the hope of reducing legacy cruft, thus making the code easier to maintain and extend. Progress has been made, but the effort is still ongoing.

Due to the strong backing by the Linux community, the LibreOffice fork will likely be bundled in upcoming versions of several major Linux distributions. It's already planned for inclusion in Ubuntu 11.04, which is coming in April.

LibreOffice 3.3 is available to download from the project's official website, with support for Linux, Windows, and Mac OS X. The source code can be found in the official LibreOffice version control repository, which is hosted on FreeDesktop.org.
Taken from: http://arstechnica.com/open-source/news/2011/01/the-document-foundation-announces-first-release-of-libreoffice.ars

Download page: http://www.libreoffice.org/download/

49
Trading Grounds / F.S. Some Parts CPU, Case, Vid Card
« on: December 30, 2010, 01:30:58 PM »
AMD Athlon X2 4200: $200
http://products.amd.com/en-na/DesktopCPUDetail.aspx?id=59

Thermaltake Soprano: $650
http://www.thermaltakeusa.com/Product.aspx?S=1161&ID=1448

Diamond Radeon HD 4870 1GB: $600
http://www.techspot.com/review/113-radeon-4870-x2/
This was actually a HD 4870X2, however 1 GPU no longer functions. The card itself has been cleaned and the original thermal compound replaced with Arctic Silver 5. Works perfectly as a single GPU card now.

Update

50
Handheld Gaming / Is your handheld a handheld or an iPhone?
« on: December 08, 2010, 05:11:50 PM »
I've not owned a handheld since the original Gameboy, with its spinach colours and 4 AA batteries, but I do have a smartphone (the non-iOS kind). The following article raises an eyebrow more than once; if you're into handhelds you should take a read.


Quote
Apple iOS chipping away at DS, PSP for handheld gaming crown
By Chris Foresman | Last updated about 2 hours ago

A new market research report shows that mobile phones—"particularly the iPhone"—make up a large and growing percentage of the handheld gaming market, while use of dedicated gaming devices like Nintendo's DS and Sony's PSP is slowly waning. With Apple's iPod touch outselling those other devices, plus the millions of iPhone and iPad sales, Apple's iOS is poised to become the number one mobile platform for games in the near future.

According to a new report by research firm Interpret, mobile phones are now responsible for about 44 percent of handheld gaming, up 53 percent over the last year. Use of a DS or PSP is down 13 percent over the same time period.

"The proliferation of highly multifunctional smartphones and messaging phones is a very real threat to the dominance by the DS and PSP of the handheld gaming market," Courtney Johnson, manager of research and analysis for Interpret, said in a statement. "Devices which satisfy a variety of entertainment and utility needs are fast outstripping single-function devices as consumer favorites."

That point is underscored by the fact that nearly a quarter of those that use a mobile phone exclusively for gaming have a DS or PSP but never use it.

Even among mobile phone platforms, however, the iPhone has attracted the most developer attention. id's John Carmack has been a proponent of Apple's iOS platform for gaming since 2008, noting that the iPhone 3G was "more powerful than a Nintendo DS and PSP combined." In a recent interview with Ars, he noted that id has ditched development on feature phones for iOS exclusively because the development process is so much more "pleasant."

"There's a vocal fraction of the consumer crowd on the iDevices that really wants the devices to be the successor to the PSP or DS—they want it to be a gaming machine," Carmack said. "You're somewhat hampered by the touch interface—there's a lot of places where tactile controls really are better—but you can definitely do a lot."

Wedbush Morgan Securities analyst Michael Pachter has also been bullish on iOS, particularly with respect to the iPod touch. For more than a year, he has argued that the iPod touch simply offers a better value proposition than a PSP or DS. While hardware pricing is somewhat similar, top-notch games cost in the range of $5-10 for iOS, while similar games run $20-30 for dedicated mobile gaming devices.

"Why would you pay $20 for Tetris when you can get it for $6.99 or $3.99 on iPod Touch?" he said late last year.

The analyst hasn't changed his tune, especially now that iPod touches are starting to outsell Nintendo and Sony hardware. "We're starting to see DS hardware sales crack," he noted on a recent episode of Pach Attack. "I think the ubiquity of the iPod Touch is cutting into the handheld market, I think the PSP was dead on arrival and I think the PSP2 is going to be dead on arrival."

Sony has made attempts to position the PSP Go as an alternative to iOS with little success. And the company is now—perhaps a little late in the game—exploring a mobile phone capable of playing PSP games. Nintendo, on the other hand, is integrating more hardware in its next-gen Nintendo 3DS, including gyroscope and motion sensors, cameras, and a 3D display; it will also be capable of playing 3D movies.

That new hardware might give a bit of a lifeline to the handheld market, but not for long. "Ultimately, I think handhelds are in trouble," Pachter said. "After the 3DS has had its little rush I think the handhelds will continue to decline."

51
The answer.... CHINA... study dat.

Quote
How China swallowed 15% of 'Net traffic for 18 minutes
By Nate Anderson | Last updated about 5 hours ago

In a 300+ page report (PDF) today, the US-China Economic and Security Review Commission provided the US Congress with a detailed overview of what's been happening in China—including a curious incident in which 15 percent of the world's Internet traffic suddenly passed through Chinese servers on the way to its destination.

Here's how the Commission describes the incident, which took place earlier this year:

    For about 18 minutes on April 8, 2010, China Telecom advertised erroneous network traffic routes that instructed US and other foreign Internet traffic to travel through Chinese servers. Other servers around the world quickly adopted these paths, routing all traffic to about 15 percent of the Internet’s destinations through servers located in China. This incident affected traffic to and from US government (‘‘.gov’’) and military (‘‘.mil’’) sites, including those for the Senate, the army, the navy, the marine corps, the air force, the office of secretary of Defense, the National Aeronautics and Space Administration, the Department of Commerce, the National Oceanic and Atmospheric Administration, and many others. Certain commercial websites were also affected, such as those for Dell, Yahoo!, Microsoft, and IBM.

The culprit here was "IP hijacking," a well-known routing problem in a worldwide system based largely on trust. Routers rely on the Border Gateway Protocol (BGP) to puzzle out the best route between two IP addresses; when one party advertises incorrect routing information, routers across the globe can be convinced to send traffic on geographically absurd paths.

This happened famously in 2008, when Pakistan blocked YouTube. The block was meant only for internal use, and it relied on new routing information that would send YouTube requests not to the company's servers but into a "black hole."

As we described the situation at the time, "this routing information escaped from Pakistan Telecom to its ISP PCCW in Hong Kong, which propagated the route to the rest of the world. So any packets for YouTube would end up in Pakistan Telecom's black hole instead." The mistake broke YouTube access from across much of the Internet.

The China situation appears to have a similar cause. The mistaken routing information came from IDC China Telecommunications, and it was then picked up by the huge China Telecom. As other routers around the world accepted the new information, they began funneling huge amounts of US traffic through Chinese servers, for 18 minutes.

As with many things involving cyberattacks and Internet security, it's hard to know if anything bad happened here. The entire thing could have been a simple mistake. Besides, Internet traffic isn't secure and already passes through many servers outside of one's control. Content that is sensitive but still suitable for the public Internet should be encrypted. Still, the Commission points out the many possible problems that such an IP hijack could cause.

    Although the Commission has no way to determine what, if anything, Chinese telecommunications firms did to the hijacked data, incidents of this nature could have a number of serious implications. This level of access could enable surveillance of specific users or sites. It could disrupt a data transaction and prevent a user from establishing a connection with a site. It could even allow a diversion of data to somewhere that the user did not intend (for example, to a ‘‘spoofed’’ site). Arbor Networks Chief Security Officer Danny McPherson has explained that the volume of affected data here could have been intended to conceal one targeted attack.

What about encryption?

    Perhaps most disconcertingly, as a result of the diffusion of Internet security certification authorities, control over diverted data could possibly allow a telecommunications firm to compromise the integrity of supposedly secure encrypted sessions.

The proliferation of certification authorities means that "untrustworthy" certification authorities are much harder to police, and there's speculation now that governments are involved in getting access to certificates in order to break encryption.

China has openly sought all sorts of encryption information for years, including the source code for routers, network intrusion systems, and firewalls. Rules to that effect went into effect in May 2010; they require foreign firms to submit this information to Chinese authorities before the government will purchase any such products.

But because the government review panels contain employees of rival Chinese firms, and because providing this information could make a company's worldwide products more susceptible to Chinese hacking or cyberattacks (which would in turn kill sales of said products in most countries), the Commission notes that no foreign firm has yet submitted to the new scheme.

http://arstechnica.com/security/news/2010/11/how-china-swallowed-15-of-net-traffic-for-18-minutes.ars

If you EVER doubted that China is a superpower, let that sink in for a bit.

52
News / A grope or irradiation... decisions, decisions...
« on: November 17, 2010, 05:22:39 PM »
For those that travel to the US often, the following article may be worth a read:
http://arstechnica.com/science/news/2010/11/fda-sidesteps-safety-concerns-over-tsa-body-scanners.ars
Quote
FDA sidesteps safety concerns over TSA body scanners
By Casey Johnston | Last updated a day ago

The United States Transportation Security Administration has recently come under scrutiny for, among other things, its use of X-ray full-body scanners in airports to see through clothes and to detect non-metallic explosives. But are they safe? A group of UC-San Francisco professors recently raised a number of safety concerns regarding these scanners. While the Obama administration attempted to address these worries, its assertion that the scanners are safe appears to fall short.

The TSA has slowly been implementing the use of X-ray scanners in airports (so far, 38 airports have 206 of the machines) in order to see through passengers' clothes and check them for explosive devices. Officials have asserted that the machines are okay to use on the basis of the everyday use of X-rays in medical offices. However, a group of four UCSF professors pinpointed several important differences between the medical X-ray machines and those used in airports. They described the issues in a letter to Dr. John P. Holdren, the assistant to the president for science and technology.

A normal X-ray image is a familiar sight—depending on the exposure, an X-rayed person typically appears only as a skeleton. This is because the X-rays used in those machines penetrate soft tissue and are absorbed mainly by bone.

Unlike a medical X-ray, the TSA X-ray machines are a sci-fi fan's dream: they are lower-energy beams that can only penetrate clothing and the topmost layers of skin. This provides TSA agents with a view that would expose any explosives concealed by clothing. But according to the UCSF professors, the low-energy rays do a "Compton scatter" off tissue layers just under the skin, possibly exposing some vital areas and leaving the tissues at risk of mutation.

When an X-ray Compton scatters, it doesn't shift an electron to a higher energy level; instead, it hits the electron hard enough to dislodge it from its atom. The authors note that this process is "likely breaking bonds," which could cause mutations in cells and raise the risk of cancer.
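To see why bond breaking is plausible, it helps to put numbers on a single Compton event. A back-of-envelope sketch (my own arithmetic, not the professors'; the 50 keV photon energy is an assumed order of magnitude for a backscatter scanner, not a published spec):

```python
# Maximum energy a photon can hand to an electron in one Compton
# scatter (180-degree backscatter), compared with a chemical bond.
E_photon_keV = 50.0    # assumed scanner photon energy (order of magnitude)
mc2_keV = 511.0        # electron rest energy

# For a 180-degree scatter: dE = E * (2E/mc^2) / (1 + 2E/mc^2)
ratio = 2 * E_photon_keV / mc2_keV
dE_keV = E_photon_keV * ratio / (1 + ratio)

bond_eV = 4.0          # a typical covalent bond is a few eV
print(f"max transfer: {dE_keV:.1f} keV, "
      f"about {dE_keV * 1000 / bond_eV:.0f}x a {bond_eV:.0f} eV bond")
```

Even a small fraction of that energy deposited near a molecule is thousands of times what it takes to break a covalent bond, which is the professors' point: the question is not whether individual events do damage, but how often they occur and where the dose concentrates.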

Because the X-rays only make it just under the skin's surface, the total volume of tissue responsible for absorbing the radiation is fairly small. The professors point out that many body parts that are particularly susceptible to cancer are just under the surface, such as breast tissue and testicles. They are also concerned with those over 65, as well as children, being exposed to the X-rays.

The professors pointed to a number of other issues, including the possibility that TSA agents may scan certain areas more slowly (for example, the groin, to prevent another "underwear bomber" incident like the one in December 2009), exposing that area to even more radiation. But the letter never explicitly accuses the machines of being dangerous; rather, the professors encourage Dr. Holdren to pursue testing to make sure that the casual use of these X-rays is safe.

Dr. Holdren passed the letter on to the Food and Drug Administration for review. But in its response, the FDA gave the issues little more than a data-driven brush-off. The agency cites five studies in response to the professors' request for independent verification of the safety of these X-rays; however, three are more than a decade old, and none of them deal specifically with the low-energy X-rays the professors are concerned about. The response also doesn't mention the FDA's own classification of X-rays as carcinogens in 2005.

The letter concludes that "the potential health risks from a full-body screening with a general-use X-ray security system are minuscule." But the concentration of the dose in a small volume of tissue just under the skin, plus the frequency with which many people travel, suggests that this use at least bears further scrutiny. US pilots' associations have also encouraged their members to opt for the pat-down in the meantime.

Of course, these pat-downs have recently become rather invasive, so now travelers must choose between a little irradiation and being felt up by a non-doctor.

However, the TSA does have a potential solution in hand. Of the 68 airports scanning for explosives, 30 are using millimeter-wave scanners that don't use X-rays at all; they hit the surface of the body with safer radio waves. If the TSA committed to using only this type of equipment, it could avoid the safety concerns regarding the X-ray full body scanners completely.

*W1nTry tries to remember if he ever walked thru those darn things*

53
Ole Talk / 16 Awesome Anti-theft devices
« on: September 30, 2010, 01:26:49 PM »
Check these out: http://www.walyou.com/blog/2009/12/29/anti-theft-gadgets/

Excerpt:
Quote
5. The Anti Theft Plug Mug



This innovative Anti Theft Mug is a cool way to keep your work colleagues from using your mug while you are away or on a sick day.

I am certain there are more than a few out there who get extremely angry when they find their personal mugs and dishes have been used by others in the office. It feels like an invasion of personal space and should not be tolerated. The Plug Mug offers a perfect way to keep your mug at work and know ahead of time that others cannot use it to fulfill their needs and habits.

The mug has a hole in its side sealed by a simple 'plug'; the owner removes the plug when leaving, and the mug itself becomes useless. That is, unless someone uses their fingers to keep the hole covered so liquid won't spill out.

54
Trading Grounds / F.S. Sony Cybershot DSC-S700 7.2 MP
« on: July 22, 2010, 04:57:58 PM »
Sony Cybershot DSC-S700 7.2 MP
Product Features and Technical Details
Product Features

    * 7.2-megapixel CCD captures enough detail for photo-quality 14 x 19-inch prints
    * 2.4-inch LCD display; 3x optical zoom
    * High-sensitivity shooting mode increases ISO to a maximum of 1000
    * Capture 320 x 240 video at 30 fps
    * Stores images on Memory Stick DUO or MS Pro DUO memory cards (24 MB internal memory included)

Technical Details

    * Sensor type: 1/2.5-inch CCD
    * Effective pixels: 7.1 million
    * Maximum resolution: 3072 x 2304
    * Image ratio: 4:3, 3:2, 16:9
    * ISO rating: Auto, 100, 200, 400, 800, 1000
    * Optical zoom : 3x
    * Digital zoom : Yes
    * Image stabilization: No
    * Auto focus: Yes
    * Manual focus: No
    * Auto focus type: TTL
    * Focus range: 35cm - 5cm
    * White balance override: 5 positions
    * Aperture range: f2.8 - f4.8
    * Shutter speed: 1 - 1/2000 seconds
    * Built-in flash : Yes
    * Flash modes: Auto, Red-Eye reduction, On, Off, Slow Sync
    * Exposure compensation: -2 to +2 EV in 1/3 EV Steps
    * Metering: Multi-Segment, Spot
    * Continuous drive: Yes, 0.7 fps up to 3 images
    * Movie clips: Yes, 320 x 240 at 30 fps
    * Self timer: 10 seconds
    * Storage types: Memory Stick Duo / Pro Duo + Internal
    * Storage included: 24 MB
    * Compressed format: JPEG (EXIF 2.2)
    * Quality levels: Fine, Standard
    * Viewfinder: Optical
    * LCD : 2.4 inches
    * LCD pixels: 112000
    * Video out: Yes
    * USB: Yes
    * Firewire: No
    * Battery type: AA batteries (2)
    * Dimensions: 3.6 x 2.4 x 1.1 inches
    * Weight: 6.3 ounces

1GB Sandisk Pro Duo memory card

Mint Condition

No case

Comes with original packaging, data cable, cd, etc.

Batteries not included

Price: $600

55
News / The sky is shrinking
« on: July 21, 2010, 12:32:26 PM »
Kinda scary... 2012 much?

Quote
The sky is not falling, but it is shrinking
By Matt Ford | Last updated about 2 hours ago

Scientists have discovered yet another enigma about our planet: the thermosphere has undergone serious shrinkage. The thermosphere is the largest portion of the Earth's atmosphere and is the next-to-last region before you reach the vacuum of outer space. The fact that it has contracted is not surprising; the thermosphere absorbs extreme ultraviolet (EUV) photons from the sun and warms and cools—expanding and contracting—in a pattern that follows the 11-year solar cycle. While we are coming out of one of the longer periods of low solar activity in a century, scientists have found that the thermosphere has shrunk some 28 percent. That's the largest drop in recorded history, and they cannot explain why.

Solar cycle 23 (the previous one) was unusually long—12.4 years—and the minimum between it and cycle 24 had the most days without sunspots since 1933, both of which result in a cooler thermosphere. CO2 in the lower thermosphere is the dominant cooling agent, so an increased concentration of CO2 leads to a cooler, more contracted thermosphere. The cooling process also accelerates during a solar minimum, making the entire system complex and difficult to describe fully.

To understand what is going on, scientists from the Naval Research Laboratory in Washington, DC and George Mason University have examined how the incoming solar irradiance has affected the mass density of the outer reaches of our atmosphere.

We have a long history of accurate measurements of atmospheric density at a variety of altitudes in the thermosphere. These data are derived from measuring the drag on various spacecraft and satellites, giving us over four decades of detailed measurements. EUV photons have been directly measured only since the launch of the TIMED/SEE instrument in 2002, so a proxy for EUV irradiance must be used. The only reliable proxy that has been around for long enough is the continuous observation of the 10.7cm solar radio flux. While it is not a perfect indicator, it is stable and well calibrated.

Using the global-average density data at 400km between 1967 and January 2010, the researchers found a low in 2008 where the mass density was "unequivocally lower than at any time in this historical record." The authors add that the 2008 minimum was the lowest since the beginning of the space age, 1957, when measurements of this type became possible.

The thermosphere density was a full 28 percent lower during the cycle 23/24 minimum than it was during the cycle 22/23 minimum. This is much larger than the decrease expected from the long-term trend seen over the past few cycles (that's six percent). In contrast, the solar radio flux was down only 3.7 percent between cycles.

What could drive such a large change? The authors examined the temperature profile of an arbitrary atmospheric column. Using a model that describes the temperature of the thermosphere as a function of altitude, they tweaked parameters in order to fit the data that has been seen in the recent contraction. They found that the temperature of the exosphere must be 14K lower, and that the levels of atomic oxygen at 120km must be 12 percent lower and other atomic species must be three percent lower than normal.

These results still don't explain why any of this happened, only what is needed for the model to fit.

So, the authors look at known factors and how strongly they can influence the thermosphere's mass density. The decrease in the solar radio flux is capable of explaining about one-third of the observed contraction. Another sixth or so can be explained by the elevated levels of CO2 in the atmosphere, which radiatively cooled the thermosphere. However, known mechanisms stop there, leaving over half of the decrease in mass density unaccounted for.
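The bookkeeping in that paragraph is simple enough to write out explicitly (my own arithmetic, just restating the article's fractions):

```python
# Attribution of the 28 percent thermospheric density drop,
# using the fractions given in the article.
total_drop = 0.28
from_flux = total_drop / 3   # solar radio flux decline: about one-third
from_co2 = total_drop / 6    # elevated CO2 cooling: about one-sixth
unexplained = total_drop - from_flux - from_co2

print(f"flux: {from_flux:.3f}, CO2: {from_co2:.3f}, "
      f"unexplained: {unexplained:.2f} of the {total_drop} drop")
```

One-third plus one-sixth is exactly half, so half of the observed contraction has no known cause — which is what makes the result interesting.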

The authors suggest a few possibilities. First, the relationship between the observed solar radio flux and the actual amount of EUV radiation reaching Earth may have changed drastically in the past few years. This would make the proxy measurement invalid, but there is no experimental support for it—it would have to reflect some undescribed solar phenomenon.

The other possibility they consider is that changes in the chemical makeup and dynamical processes of the mesosphere and lower thermosphere affect the concentration of atomic oxygen at the lower boundary. The authors point out that such internal processes, coupled with known anthropogenic changes, could produce the 50 percent of the change that remains unaccounted for.

The latter could represent an ominous change. As the authors themselves put it, "If changes in the radiative properties of the MLT [mesosphere and lower thermosphere] are responsible for the temperature and composition changes of the upper thermosphere, then the density anomalies may signify that an as yet unidentified climatological tipping point, involving energy balance and chemistry feedbacks, has been reached."

Before any hard conclusions are reached, the authors point out that the thermosphere's recovery as we climb out of this solar minimum needs to be monitored carefully. And we'll want to see what happens in 11 years, during the next solar minimum.

Geophysical Research Letters, 2010. DOI: 10.1029/2010GL043671

http://arstechnica.com/science/news/2010/07/the-sky-is-not-falling-but-it-is-shrinking.ars

56
General Gaming & System Wars / Alice: Madness Returns
« on: July 21, 2010, 12:22:26 PM »
Yes folks, a sequel is in the books....  :cowboy:

Quote
Alice sequel coming in 2011, first screens inside
By Ben Kuchera | Last updated about 18 hours ago

It has been talked about for quite some time, but EA has finally shown off some assets from the sequel to the cult classic American McGee's Alice, called Alice: Madness Returns. At a press event earlier today, EA described the game as an action adventure, with the very broad release window of 2011.

    Alice: Madness Returns takes place 10 years after the conclusion of the original game, with Alice struggling to recover from the emotional trauma of losing her entire family in a fatal fire. After spending a decade institutionalized in an insane asylum, she is finally released to the care of a psychiatrist who just may be able to help her conquer the nightmarish hallucinations that still haunt her. Alice embarks on a mission to root out the true cause of her family's mysterious death, jumping from a gloomy and stark London to a rich and provocative Wonderland.

There is a teaser trailer, some screenshots, an official site, and that's pretty much it. This is certainly a case of a creative team taking something that was already dark and twisted and making it slightly more dark and twisted, but the original game is still talked about fondly. Check out the official site for a brief trailer, and you can gaze upon a few screenshots to make the wait a little more tolerable.


http://arstechnica.com/gaming/news/2010/07/alice-sequel-coming-in-2011-first-screens-inside.ars

Click the link, check the screenshots (they look promising) and check out the trailer:

http://www.ea.com/alice/videos/teaser-trailer

PS... be disturbed, be VERY VERY DISTURBED....

57
Processors / Intel silently launches Hex-core Core i7 970
« on: July 19, 2010, 11:01:07 AM »
For those with some dosh to spend (but not as much as the Extremely Expensive Ed.), this WILL interest you:
Quote
Intel six-core Core i7-970 is spotted
Still not official
By Spencer Dalziel
Mon Jul 19 2010, 14:12

INTEL HAS STEALTH RELEASED a six-core 3.2GHz Core i7-970 that's already up for grabs online.

Intel has launched a couple of processor upgrades recently without so much as any flag waving so we weren't too surprised to see another addition hit retail.

Pop along to Newegg and the website has the processor listed for £589 with free shipping, available to purchase now. It's much cheaper than the Core i7-980, which starts at around £800 but comes with much the same features.

The Intel six-core Core i7-970 is based on the same 32nm Gulftown design as Intel's other Core i7 processors but has six cores on the chip. It also has a 12MB L3 cache and a triple-channel memory controller. Punters and tweakers will also get more oomph from the chip with Turbo Boost.

Intel hasn't made this official yet so it will probably unveil its roadmap when it's already late. µ

Original link: http://www.theinquirer.net/inquirer/news/1723312/intel-core-core-i7-970-spotted

Newegg link:http://www.newegg.com/Product/Product.aspx?Item=N82E16819115066&Tpk=core%20i7%20970

58
Hardware, Tweaking & Networking / Quake on a handheld...
« on: May 05, 2010, 04:43:01 PM »
We really are progressing gents and ladies...

Go to the site and check out the video:

http://www.theinquirer.net/inquirer/news/1604220/intel-moorestown-chip-demo-video


Well done Intel

59
Console Gaming Archive / Top 10 consoles of ALL time
« on: April 19, 2010, 09:50:05 PM »
This will interest many and confuse many as well... I don't make the call I just post it:
Quote
Top 10 games consoles of all time
Hardware to hammer for your gaming fun
David Neal
V3.co.uk, 17 Apr 2010
Gaming consoles hit the streets in the mid-1970s, and have developed alongside the personal computer ever since.

In previous Top 10s for computer, console and arcade games, many readers asked for a list of consoles.

All lists are subjective. I don't care. This is my list of the top 10 consoles, not yours. It is based on personal experience, games played and hours lost.

10. PlayStation 3
A sleek, black Blu-ray player with a 120GB hard drive and rechargeable controllers. What's not to like? It may have stuttered with its Home project, and could fail to set anything alight with its Move controllers - except the knob at the end of them - but the PS3 is a good console. It's a shame it looks destined to live in its little brother's shadow.

9. PlayStation 2
Although the Grand Theft Auto series appeared on its earlier brother, the game made its bones on the PlayStation 2. GTA San Andreas and Vice City are still the benchmarks by which all other free-roaming/mission-based crime sagas are judged, and the PS2 is equally well remembered. A massively hyped release, the PS2 impressed straight out of the box, bringing with it a slew of quality games.

8. Neo Geo
Released by SNK in the early 1990s to games fans with a lot of leisure time and money, the Neo Geo offered far superior graphics to its 16-bit counterparts, giving the much-sought-after 'almost-arcade' quality to home gaming. Much pricier than the competition, the Neo Geo was to be played at other people's houses when King of Fighters 4 was the finger-callus-causer of choice.

7. Nintendo Gameboy
Released in 1990 and exploding onto the world with its monochrome falling blocks game called Tetris, the Gameboy was a bulky two-button affair. Boring to look at, it took on, and smashed to pieces, the flashier Sega Gamegear, which had a colour screen and hardly any decent games.

6. Super Nintendo System
Street Fighter 2, Mario Kart. Shall I continue? Like the Mega Drive, the SNES arrived at a time when today's 30-something big spenders were finding their bones on home consoles. Often, like a single duvet, accompanying them to university, until they got a double bed, and the PlayStation was released. A classic games machine, and the biggest seller of the 16-bit lot.

5. Nintendo Wii
Needs no introduction. Even grandmothers know what the Wii is. They may swing the nunchuck around wildly, but the idea, that the game follows the motions you make in real life, is a lot easier to grasp than the alternatives. Just remember to use the wrist strap. The Wii marries gaming with family fun. Everyone should have one.

4. Nintendo DS
When even your mum has a games console, albeit a handheld one, you know it is worth paying attention to. The puzzle-happy, e-book wannabe, dual touch-screen, brain-training DS is a phenomenon. It has sold over 125 million units. In terms of sales, it is the Dan Brown of consoles, mind-bending and ubiquitous.

3. Xbox 360
Should be number two really, except for one thing: why does it have to be so loud? Having an Xbox is like having a hamster in a ball constantly scuttling about on a wooden floor. Only instead of a hamster it's a cow chasing an elephant around in a speaker. The Xbox 360 has great games and by far the best online experience, with Live making the PlayStation and Wii equivalents look dead. Not to mention the Halo series, which is the jam in Live's exquisite doughnut.

2. PlayStation 1
It is difficult to explain just how exciting the release of the PlayStation was. If you weren't there and didn't play time-consuming titles like Wipeout 2097 or the Tony Hawk's and Resident Evil series, not to mention Parappa the Rapper, which is the best game ever, you might as well just move on. And I haven't even mentioned Tekken, Battle Arena Toshinden or Ridge.

1. Sega Mega Drive
Sonic 3D, Mortal Kombat, Fifa 94. I could go on, but it would start to seem that I have classic games Tourette's. Given to me one Christmas, the Mega Drive was my teenage years. I should probably hate it, but instead I still play it. Take that, first girlfriend!




http://www.v3.co.uk/v3/news/2261522/top-consoles

I do consider his choices questionable but I do like the fact that the 360 is pretty high up *cough above PS3 cough*. The man I want to hear from about this is Shiv


60
News / You think your ISP is fast.. pfft
« on: April 14, 2010, 02:40:38 PM »
...that's all I have to say about this:

Quote
Who's #1 in broadband? 1Gbps fiber for $26 in Hong Kong
By Nate Anderson | Last updated about 3 hours ago

According to people like Ivan Seidenberg, Verizon's CEO, the US is number one in broadband, no question about it. But one only has to look around the world to see just how specious such claims are.
City Telecom's ad for its 1Gbps service

Take Hong Kong as an example. City Telecom made waves a few months ago with its US$13, symmetric 100Mbps connections. Today, the company slashed prices on its much faster 1Gbps fiber-to-the-home offering; a fully symmetric, 1Gbps connection costs HK$199... or US$26 a month.

Want phone service with that? That'll be US$3. IPTV service will cost another $6-12, depending on the channel package. (There's also a US$115 installation charge to run the fiber link from the building basement up to an individual apartment.)

This is an exceptional offer, but City Telecom isn't alone in offering service that should make US operators cringe—and US customers green with envy. Hutchison Telecom offers 100Mbps symmetric connections for US$13. i-Cable offers 130Mbps downloads for $39 per month using DOCSIS 3.0 tech.
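Put in price-per-megabit terms (simple arithmetic on the monthly US-dollar prices quoted above; nothing here beyond the article's own numbers), the gap is stark:

```python
# Monthly price per Mbps for the Hong Kong offers mentioned above.
offers = {
    "City Telecom 1Gbps fiber":   (26, 1000),   # (US$/month, Mbps)
    "Hutchison 100Mbps":          (13, 100),
    "i-Cable 130Mbps DOCSIS 3.0": (39, 130),
}
for name, (usd, mbps) in offers.items():
    print(f"{name}: ${usd / mbps:.3f} per Mbps per month")
```

At under three US cents per megabit, the 1Gbps fiber tier is roughly an order of magnitude cheaper per unit of bandwidth than the other Hong Kong offers, let alone typical US plans.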

This isn't the US market, so prices aren't directly comparable, but Hong Kong and the US are almost identical when it comes to GDP per capita, adjusted for purchasing power parity (PPP).

Hong Kong is one of the densest spots on earth. One wouldn't expect to see this level of price and competition across a country as broad and sprawling as the US, but one would expect it to be possible somewhere. Sadly, even something like 100Mbps is hard to come by in most US cities; 1Gbps is unknown, except to tiny specialty operators, even in a place like New York City.

City Telecom took out a full-page ad in the South China Morning Post today, advertising its new offering with the tagline, "1000M: Transform your life."

http://arstechnica.com/tech-policy/news/2010/04/1gbps-symmetric-fiber-us26-in-hong-kong.ars
