Saturday, October 20, 2007

Early humans knew their way around makeup, tools and shellfish: Key Human Traits Tied to Shellfish Remains


Small stone blades and a reddish body pigment recently discovered in a cave near the southern tip of South Africa suggest that the use of symbolism and tools -- hallmarks of modern human behavior -- had already begun to develop 164,000 years ago, far earlier than previously believed, researchers report.


Arizona researchers also found in the cave the earliest evidence of seafood consumption. The earliest previous evidence of the consumption of shellfish was dated to 125,000 years ago, and the oldest stone tools, known as bladelets, were 70,000 years old.



Researchers know that modern humans evolved in Africa between 100,000 and 200,000 years ago, but there has been little archeological evidence available to pin down a timeline.


Evidence along the coastlines has been particularly scarce because rising ocean waters after the end of the last glacial period obliterated most sites.


The new evidence comes from a cave at Pinnacle Point on Mossel Bay, about halfway between Cape Town and Port Elizabeth. The cave would have been two to three miles inland at the time it was occupied, high enough above sea level to be safe from flooding.


A team headed by anthropologist Curtis Marean of Arizona State University's Institute of Human Origins reported Thursday in the journal Nature that it found shells from cooked seafood in the cave, including brown and black mussels, small saltwater clams, sea snails and even a barnacle, which is typically found on whale blubber or skin.


Seafood was the last item added to the diet of humans before they began to domesticate animals and grow their own food. Marean said the ancient humans brought the shellfish to the cave and cooked it over hot rocks, which caused it to pop open. When Marean's team replicated the process, they found the food tasty but a little dry, he said.


The small bladelets, about the size of a pinky finger, are thought to have been attached to spears, often in multiples, providing an advantage over hand-held tools.


The red ocher pigment, found in the form of 57 lumps of hematite collected by the cave dwellers, would have been used for decorating their bodies and coloring artifacts. Such decoration is thought to be an early manifestation of symbolism, used to convey messages to other groups living in the vicinity.


Africa was cold and very dry during this period, and Marean and his colleagues speculated that the shellfish were a kind of "starvation food" the early humans turned to when there was little else available.


Evidence indicates there were only a handful of places in Africa where humans could have survived during this glacial period. Mossel Bay may have been one of them, they said.


"It is possible that this population could be the progenitor population for all modern humans," they wrote.
--------------------------------------------------------------------------------
Key Human Traits Tied to Shellfish Remains
--------------------------------------------------------------------------------



Almost from the start, it seems, humans headed for the shore. But this was no holiday for them. More than likely, it was a matter of survival at a perilous time of climate change in Africa 164,000 years ago.



By then Homo sapiens had developed a taste for shellfish - much earlier than previously thought, scientists report in today's issue of the journal Nature - as the species was adapting to life in caves on the craggy coast of southern Africa.


Exploring a cave in a steep cliff overlooking the ocean, an international team of scientists found deposits of shellfish remains, hearths, small stone blades and fragments of hematite, some of which, the scientists believe, had been ground for use as the coloring agent red ochre that sometimes had symbolic meaning.


"The shellfish," the researchers concluded, "may have been crucial to the survival of these early humans as they expanded their home ranges" in response to the cooler and drier conditions that had prevailed for thousands of years in the interior of Africa.


Curtis W. Marean, the team leader and a paleoanthropologist with the Institute of Human Origins at Arizona State University, said, "Shellfish was one of the last additions to the human diet before domesticated plants and animals were introduced," more than 10,000 years ago.


In an accompanying article, Sally McBrearty of the University of Connecticut and Chris Stringer of the Natural History Museum in London, who were not involved in the research, said the find provided "strong evidence that early humans displayed key elements of modern behavior" as early as 164,000 years ago.


The discovery was made in a cave at Pinnacle Point near Mossel Bay on the southern coast of South Africa, about 200 miles east of Cape Town.


Previous research had indicated that human ancestors had for ages depended solely on terrestrial plants and animals. Both fossil and genetic data show that modern humans evolved 150,000 to 200,000 years ago, but archaeological evidence for the emergence of modern behavior in technology, creativity, symbolic thinking and lifestyles is sparse.


But six years ago, at Blombos Cave, near Pinnacle Point, archaeologists uncovered 77,000-year-old tools along with pigments and engraved stones suggesting symbolic behavior, a sign of early creativity. Now, at the Pinnacle Point cave site, the shellfish remains reveal another important innovation.


Other coastal populations had been found exploiting marine resources as early as 125,000 years ago. Neanderthals were cooking shellfish in Italy about 110,000 years ago.


The presence of red ochre at Pinnacle Point, Dr. Marean's team also reported, indicated that at this time humans already "inhabited a cognitive world enriched by symbols." The researchers said the material had both symbolic and utilitarian functions and was probably used for body painting and for coloring artifacts.


Until recently, anthropologists generally assumed that modern human behavior arose much more recently, probably around 45,000 years ago, as a consequence of some unidentified change in brain function that favored communication and symbolic thinking to express social status and group identity. This interpretation was based on the apparently sudden appearance of art and self-adornment at sites in Europe.


The search for early human use of marine resources, supported by the National Science Foundation, centered on the cave at Pinnacle Point because of its position high on a cliff. Other seashore sites of early human occupation had been inundated by the rise in sea level, beginning about 115,000 years ago at the end of Africa's long arid conditions.


Forced to seek new sources of food, some of the people migrated to the shore in search of "famine food." At Pinnacle Point, the discovery team reported, they feasted on a variety of marine life: brown mussels, giant periwinkles and whelks.


So on the southern shore of Africa, Dr. Marean said in a statement issued by Arizona State, a small population of cave-dwelling modern humans struggled and survived through the prevailing cold, eating shellfish and developing somewhat advanced technologies.


"It is possible," he concluded, "that this population could be the progenitor population for all modern humans."







Google Gets Undue Credit for Ad Conversions: CORRECTED-aQuantive exec leads Microsoft charge into Web ads


Google Inc. has been getting undeserved credit for many of the clicks on the online ads it delivers via its search engine, and Microsoft Corp. wants to put a stop to that.


So said Brian McAndrews, senior vice president of Microsoft's Advertiser Publisher Solutions Group, during a panel discussion at the Web 2.0 Summit in San Francisco on Thursday.


Currently, systems for tracking ad conversions and analyzing online marketing campaigns focus on the last ad a user viewed or clicked on, he said. This gives all the credit to that last publisher and none to other sites the user may have visited earlier that influenced the user to seek more information about the advertiser, McAndrews said.


In particular, this situation has unfairly benefitted Google because many times someone will see a display ad on a site and go to Google, search for the vendor's name, and then click on the vendor's text ad served by Google, he said.


But Microsoft is developing a technology called "conversion attribution" that will track the trail of ads seen by a user, so that advertisers get a more complete understanding of how effective their marketing campaigns are, he said.


Along the way, advertisers will get a more balanced view of the value of their ads across a wider trail of Web sites and via a variety of ad formats, not just the last ad displayed by the last publisher, which is often Google, he said.
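For readers who want the mechanics, here is a minimal sketch of how a multi-touch "conversion attribution" model differs from today's last-ad-standing approach. Microsoft has not published details, so the even-split weighting and publisher names below are illustrative assumptions, not its actual method.

```python
from collections import defaultdict

# One user's ad trail before a purchase ("conversion"), oldest first.
# Publisher names are hypothetical, not data from the article.
ad_trail = ["news-site-display", "portal-banner", "google-search-text"]

def last_click_credit(trail):
    """Today's dominant model: the final ad before conversion gets all credit."""
    return {trail[-1]: 1.0}

def multi_touch_credit(trail):
    """One possible 'conversion attribution' model: split credit evenly
    across every ad in the trail, not just the last one."""
    share = 1.0 / len(trail)
    credit = defaultdict(float)
    for publisher in trail:
        credit[publisher] += share
    return dict(credit)

print(last_click_credit(ad_trail))   # all credit to the final search ad
print(multi_touch_credit(ad_trail))  # each publisher earns ~0.33
```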


"We'll introduce conversion attribution to give [more publishers] credit and it will devalue search [advertising]," McAndrews said.


Search advertising is the largest online ad format, accounting for about 40 percent of total ad spend. Google has built its empire on these pay-per-click ads, which the company matches to the content of queries on its search engine and to the content of third-party Web sites on its ad network.


While search has been the main driver of the blistering growth of online advertising in the past five years, that won't be the case in the coming five years, McAndrews said.


In addition to the "conversion attribution" technology, the shift away from search ads will be fueled by the increased spending in online ads from large companies which prefer display and rich media advertising designed to boost their brands, and for which pay-per-click text ads are less effective, he said.


Google didn't have any representatives participating in the panel. The company didn't immediately respond to a request for comment.


Microsoft didn't immediately respond to a request for clarification on the availability of the "conversion attribution" technology.


McAndrews, who came to Microsoft recently via its US$6 billion acquisition of aQuantive Inc., of which he was the CEO, shared the stage with other ad executives in a panel titled "Edge: The Advertising Model" moderated by conference chair John Battelle.




CORRECTED-AQuantive exec leads Microsoft charge into Web ads




SAN FRANCISCO, Oct 19 (Reuters) - Microsoft Corp (MSFT.O) said its $6 billion acquisition of aQuantive will fill out its Internet advertising technology and make it as competitive as Google Inc (GOOG.O) in that market.


The August purchase of the digital advertising firm positions Microsoft as one of the few companies with the money and technology know-how to be a major force in an industry expected to grow to $80 billion by 2010, said Brian McAndrews, former aQuantive CEO who is leading Microsoft's advertising efforts.



"Our goal is to have a significant amount of the revenue in the industry going through our technology, our platform," said McAndrews, who spoke to Reuters this week on the sidelines of the Web 2.0 Summit in San Francisco. "All parts of our business will be bigger and more profitable."


AQuantive's Atlas technology allows advertisers and publishers to buy and sell digital advertisements.


Microsoft still needs to be able to sell "contextual" advertising to target specific visitors of a Web site, but McAndrews expects that it can introduce that capability next year, providing a full suite of advertising tools.


McAndrews is at the vortex of Microsoft's major push into Web advertising technology. Threatened by Google, the company is working to build a major business to be a pillar of earnings alongside its main Windows and Office software. Microsoft made its biggest-ever acquisition when it bought aQuantive and handed the keys to its advertising business to McAndrews. The executive ranked No. 11 in Business 2.0 magazine's "50 Who Matter Now," ahead of even his Microsoft bosses, who include CEO Steve Ballmer and Chairman Bill Gates.



'1 + 1 = 3'


McAndrews, who is senior vice president at Microsoft's advertiser and publisher services group, said the benefits of the acquisition would increase over time.



"Absolutely, one plus one equals three, but not on day one," he said. "I would say something like one plus one equals two and a quarter."


Microsoft's acquisition of aQuantive is part of a $10 billion consolidation spree in the online advertising market, centered mostly around deep-pocketed players. Google agreed to pay $3.1 billion for DoubleClick, while Yahoo Inc (YHOO.O) snatched up the rest of Right Media for $680 million.


"There just aren't going to be very many companies that can invest the kind of money it's going to take to build these platforms and have the technology expertise," said McAndrews.


For now, aQuantive's Avenue A/Razorfish interactive ad agency will remain within Microsoft. Wall Street analysts had earlier speculated that the lower-margin unit, which accounts for nearly 60 percent of aQuantive's revenue, might be sold.


McAndrews explained that Avenue A had also been a key link between its advertiser clients and the part of aQuantive that specializes in ad-serving and tracking technologies. AQuantive has also profited from its technology ties with traditional ad agencies.


"Agencies are a critical part of the ecosystem," McAndrews said. "They aren't going away, they shouldn't go away and, frankly, if there is a technology company trying to disintermediate them, then I think that's a mistake."


Google has sought to extend its auction-based technology for efficiently selling Web search advertising to traditional media, from television to radio and print. While the company said it did not plan to replicate the ad agency model, many in the industry fear it might eventually offer a wider range of advertising services, including creating campaigns.









In Search of Wireless Wiggle Room



By Judith Chevalier


I RECENTLY watched a YouTube clip of a young man removing the memory chip from his iPhone with his teeth, in an attempt to "unlock" the device for use on a network other than the AT&T system for which the phone was exclusively sold. His gyrations were a particularly vivid reminder of the limits imposed on cellphones by the companies that run national wireless networks in the United States.



But there are signs that the existing order in the wireless world may finally be changing.


This month, despite the opposition of companies like Verizon, the Federal Communications Commission reiterated that its coming auction of wireless spectrum would include rules intended to give consumers more choices in the phones they can use. These "open access" auction rules have already received considerable attention, but the commission also faces other decisions that will significantly affect the availability and the price of wireless service.


For decades, the F.C.C. has allocated almost all the prime frequencies available for communication for the exclusive use of licensees, and since the 1990s it has emphasized auctions for allocating that spectrum. Economists mostly applauded the move to the auction system, because it subjected prospective licenses to a market test: the F.C.C. no longer had to try to divine who would make the most productive use of the spectrum; the companies that were willing to pay the most at auction would win.


Not all of the spectrum is auctioned, however. Some of it is left "unlicensed" - meaning that no one has exclusive rights to it. Baby monitors, garage door openers and Wi-Fi all work on unlicensed parts of the spectrum, in a kind of free-for-all. As technology improves to prevent these devices from stumbling over one another's signals, the argument for leaving part of the spectrum unlicensed grows stronger as well.


In the past, when the F.C.C. auctioned spectrum for cellular service, it allowed the winners to determine the equipment and applications that would run on their networks. That created the current status quo, in which a vast majority of American consumers buy a handset from a wireless service provider.


The open-access rules, which will apply to about one-third of the spectrum being sold at the auction, represent a significant departure from past practice. They require the winners to let consumers use any tested, safe and compatible device or application on their networks. Entrepreneurs could sell handsets with capabilities that are unavailable - or unavailable at affordable prices - from current carriers. As long as a manufacturer offered a product compatible with a network, the consumer could use that product - without needing to take measures like unlocking it with his teeth.


The spectrum in the coming auction is low frequency, which means that a signal using it could travel long distances and penetrate walls, making it very appealing for a national wireless network. Because the signal travels well, for example, fewer towers would be needed. Thus, it is possible that a new network could emerge from the auction to compete with the four existing national carriers.
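A rough sense of the physics: the free-space path loss formula says loss grows with both distance and frequency, so for a fixed link budget, range scales inversely with frequency. The Python sketch below compares the auction band (roughly 700 MHz) with a 1900 MHz PCS band; it is an idealized free-space calculation offered only to illustrate the scaling, not a network plan.

```python
import math

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB (standard textbook formula)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# For a fixed link budget, free-space range scales as 1/frequency.
budget = fspl_db(1.0, 1900)   # loss over 1 km at 1900 MHz: ~98 dB
range_700 = 1900 / 700        # same budget reaches ~2.7x farther at 700 MHz
print(f"1 km at 1900 MHz costs {budget:.1f} dB")
print(f"At 700 MHz that budget reaches ~{range_700:.1f} km")
print(f"Covered area ratio ~{range_700 ** 2:.1f}x, hence fewer towers")
```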


Although the open-access rules could lead to innovation in devices and applications, they might not do much to increase competition among carriers. Preventing an influx of additional competition would be valuable for the big existing carriers, so they may bid aggressively enough that no new carriers will win significant licenses at the auction.


Both Google and Frontline Wireless - a start-up whose vice chairman is Reed E. Hundt, a former F.C.C. chairman - lobbied the commission to effectively force new entry into the market by imposing added restrictions on auction winners. They argued that winners should be forced to resell use of the spectrum at wholesale rates, a provision the F.C.C. did not adopt. The idea was that this spectrum could support multiple services, creating competition and driving down prices.


Having missed the opportunity to include these provisions in the coming auction, the F.C.C. will have another chance this year to create cheaper wireless broadband services. Google and other technology companies, including Dell, Philips and Microsoft, are part of a group called the White Space Coalition that is asking the F.C.C. to open up the empty space between assigned TV channels to unlicensed users and devices.


The idea would work like this: In many areas, not all broadcast channels are in use. The unused channels are "white spaces" of high-quality spectrum that could be made available to local Internet service providers. Unlike the much higher frequency of Wi-Fi, television broadcast frequencies can travel for miles and penetrate walls, providing a much broader range for Internet service. Because the unused channels vary across the country, the group proposes that consumers be able to buy generic devices, like PC cards for Wi-Fi, that would search for open frequencies and connect to a service.
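As a sketch of how such a generic device might pick a channel: consult the list of stations licensed locally and treat the remaining channels as white space. The channel assignments below are invented for illustration; a real device would also have to sense actual transmissions to avoid interference.

```python
# Hypothetical white-space lookup; channel data invented for illustration.
TV_CHANNELS = range(2, 70)                 # U.S. broadcast channels 2-69 in 2007

licensed_in_area = {4, 7, 9, 13, 22, 38}   # stations on the air locally

def find_white_spaces(licensed):
    """Channels with no local broadcaster are the usable 'white spaces'."""
    return [ch for ch in TV_CHANNELS if ch not in licensed]

open_channels = find_white_spaces(licensed_in_area)
print(f"{len(open_channels)} open channels, e.g. {open_channels[:5]}")
# A rural area with few stations yields many open channels; a big city, few.
```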


Big cities tend to have many TV stations. That means there would be more white-space opportunity in rural areas, which also tend to be underserved by wired broadband services. If the plan worked, rural America could be dotted with high-speed wireless Internet service providers - like having a Starbucks on every corner, minus the coffee.


The catch? Television broadcasters argue that these services would interfere with digital television reception. The National Association of Broadcasters has run TV spots depicting doomsday - an increasingly irritated woman banging on her TV because, "if Microsoft and other high-tech companies have their way, your TV could freeze up and become unwatchable."


Interference with broadcast television wouldn't affect the more than 80 percent of television-viewing households who don't watch over-the-air TV anyway. And it wouldn't be a problem if the white-space gadgets work as intended, avoiding frequencies where there is any conflict. Unfortunately, that is not a given: a Microsoft device failed F.C.C. engineering tests earlier this year.


The F.C.C. has shown some bravery in maintaining the open-access auction rules, despite efforts to dismantle them. Now the commission needs to deal with the arguments of the broadcasters.


While there is certainly a risk that white-space Internet devices could interfere with some television signals, the potential for cheap, accessible wireless broadband is too great to pass up.


Judith Chevalier is a professor of economics and finance at the Yale School of Management.






Climate change blamed for fading foliage: observation



An observation on climate change:


Every fall, Marilyn Krom tries to make a trip to Vermont to see its famously beautiful fall foliage.



This year, she noticed something different about the autumn leaves.


"They're duller, not as sparkly, if you know what I mean," Krom, 62, a registered nurse from Eastford, Conn., said during a recent visit. "They're less vivid."


Other "leaf peepers" are noticing, too, and some believe climate change could be the reason.


Forested hillsides usually riotous with reds, oranges and yellows have shown their colors only grudgingly in recent years, with many trees going straight from the dull green of late summer to the rust-brown of late fall with barely a stop at a brighter hue.


"It's nothing like it used to be," said University of Vermont plant biologist Tom Vogelmann, a Vermont native.


He says autumn has become too warm to elicit New England's richest colors.


According to the National Weather Service, temperatures in Burlington have run above the 30-year averages in every September and October for the past four years, save for October 2004, when they were 0.2 degrees below average.


Warming climate affects trees in several ways.


Colors emerge on leaves in the fall, when the green chlorophyll that has dominated all spring and summer breaks down.


The process begins when shorter days signal leaves to form a layer at the base of their stems that cuts off the flow of water and nutrients. But in order to hasten the decline of chlorophyll, cold nights are needed.


In addition, warmer autumns and winters have been friendly to fungi that attack some trees, particularly the red and sugar maples that provide the most dazzling colors.


"The leaves fall off without ever becoming orange or yellow or red. They just go from green to brown," said Barry Rock, a forestry professor at the University of New Hampshire.


He says 2004 was "mediocre, 2005 was terrible, 2006 was pretty bad although it was spotty. This year, we're seeing that same spottiness."


"Leaf peeping" is big business in Vermont, with some 3.4 million visitors spending nearly $364 million in the fall of 2005, according to state estimates.


State tourism officials reject the notion that nature's palette is getting blander. Erica Housekeeper, spokeswoman for the state Department of Tourism and Marketing, said she had heard nothing but positive reports from foresters and visitors alike this year.


The problem is perception, Housekeeper says: Recollections of autumns past become tinged by nostalgia.


"Sometimes, we become our own worst critics," Housekeeper said.


People who rely on autumn tourism in New England are worried.


"I don't have a sense that the colors are off, but the timing is definitely off," said Scott Cowger, owner and innkeeper at the Maple Hill Farm Bed & Breakfast Inn at Hallowell, Maine.


"Some trees are just starting to change now," Cowger said Thursday. "It used to be, religiously, it was the second week of October when they were at their peak. I would tell my guests to come the second week if you want to see the peak colors. But it's definitely the third or fourth week at this point."


People in Northampton, Mass., are still waiting on fall color. If foliage-viewing is the goal, "I wouldn't send anybody down this way yet," Autumn Inn desk clerk Mary Pelis said this past week.


"The way things are going, the foliage season is the one sure thing for us," said Amie Emmons, innkeeper at the West Mountain Inn, in Arlington, Vt. "We book out two years in advance. It's very concerning if you think the business could start to be affected





Errors blamed for nuclear arms going undetected


Air Force weapons officers assigned to secure nuclear warheads failed on five occasions to examine a bundle of cruise missiles headed to a B-52 bomber in North Dakota, leading the plane's crew to unknowingly fly six nuclear-armed missiles across the country.


That August flight, the first known incident in which the military lost track of its nuclear weapons since the dawn of the atomic age, lasted nearly three hours, until the bomber landed at Barksdale Air Force Base in northern Louisiana.


But according to an Air Force investigation presented to Defense Secretary Robert M. Gates on Friday, the nuclear weapons sat on a plane on the runway at Minot Air Force Base in North Dakota for nearly 24 hours without ground crews noticing the warheads had been moved out of a secured shelter.


"This was an unacceptable mistake," said Air Force Secretary Michael W. Wynne at a Pentagon news conference. "We would really like to ensure it never happens again."


For decades, it has been military policy to never discuss the movement or deployment of the nuclear arsenal. But Wynne said the accident was so serious that he ordered an exception so the mistakes could be made public.


On Aug. 29, North Dakota crew members were supposed to load 12 unarmed cruise missiles in two bundles under the B-52's wings to be taken to Louisiana to be decommissioned. But because of what the Air Force has ruled were five separate mistakes, six of the missiles still contained nuclear warheads.


According to the investigation, the chain of errors began the day before the flight when Air Force officers failed to inspect five bundles of cruise missiles inside a secure nuclear weapons hangar at Minot. Some missiles in the hangar have nuclear warheads, some have dummy warheads, and others have neither, officials said.


An inspection would have revealed that one of the bundles contained six missiles with nuclear warheads, investigators said.


"They grabbed the wrong ones," said Maj. Gen. Richard Newton, the Air Force's deputy chief of staff in charge of operations.


After that, four other checks built into weapons-handling procedures were overlooked, allowing the plane to take off Aug. 30 with crew members unaware that they were carrying enough destructive power to wipe out several cities.
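A back-of-the-envelope illustration, not Air Force procedure: with five independent layers of inspection, a live warhead slips through only if every layer fails, so even a generous assumed per-check miss rate compounds to a vanishing probability. Skipping the checks, as the investigation found happened, removes that protection entirely.

```python
# Illustrative arithmetic only (not Air Force procedure): five layered
# checks make a miss vanishingly rare *if each check is performed*.
p_miss = 0.01                      # assumed chance one inspection misses
p_all_five_miss = p_miss ** 5      # all five must miss independently
print(f"Probability all five checks fail: {p_all_five_miss:.0e}")  # 1e-10
# A skipped check contributes no factor at all, so the protection
# of layered inspection disappears when the layers are not exercised.
```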


Newton said that even though the nuclear missiles were hanging on the B-52's wings overnight without anyone knowing they were missing, the investigation found that Minot's tarmac was secure enough that the military was never at risk of losing control of the warheads.


The cruise missiles were supposed to be transported to Barksdale without warheads as part of a treaty that requires the missiles to be mothballed. Newton said the warheads are normally removed in the Minot hangar before the missiles are assigned to a B-52 for transport.


The Air Force did not realize the warheads had been moved until airmen began taking them off the plane at Barksdale. The B-52 had been sitting on the runway there for more than nine hours, however, before they were offloaded.


Newton did not say what explanation the Minot airmen gave investigators for their repeated failure to check the warheads once they left the secured hangar, saying only that there was inattention and "an erosion of adherence to weapons-handling standards."


Air Force officials who were briefed on the findings said investigators found that personnel lacked neither the time nor the resources to perform the inspections, indicating that the weapons officers had become lackadaisical in their duties.


One official noted that until the Air Force was given the task of decommissioning the cruise missiles this year, it had not handled airborne nuclear weapons for more than a decade, implying that most of the airmen lacked experience with the procedures.


The Air Force has fired four colonels who oversaw aircraft and weapons operations at Minot and Barksdale, and some junior personnel have also been disciplined, Newton said. The case has been handed to a three-star general who will review the findings and determine whether anyone involved should face court-martial proceedings.


Despite the series of failures, Newton said, the investigation found that human error, rather than inadequate procedures, was at fault. Gates has ordered an outside panel headed by retired Gen. Larry D. Welch, a former Air Force chief of staff, to review the Pentagon's handling of nuclear weapons.




From CNN International: Air Force officers relieved of duty over loose nukes


A six-week probe into the mistaken flight of nuclear warheads across the country uncovered a "lackadaisical" attention to detail in day-to-day operations at the air bases involved in the incident, an Air Force official said Friday.


Four officers -- including three colonels -- have been relieved of duty in connection with the August 29 incident in which a B-52 bomber flew from Minot Air Force Base in North Dakota to Barksdale Air Force Base in Louisiana.


The plane unknowingly carried a payload of nuclear-tipped cruise missiles.


"Nothing like this has ever occurred," Newton said.


"Our extensive, six-week investigation found that this was an isolated incident and that the weapons never left the custody of airmen -- were never unsecured -- but clearly this incident is unacceptable to the people of the United States and to the United States Air Force."


The probe also found there was "an erosion of adherence to weapons-handling standards at Minot Air Force Base and at Barksdale Air Force Base," Newton said.


"We have acted quickly and decisively to rectify this," he added.





Relieved of duty were the Minot wing commander and maintenance crew commander, and the Barksdale operational group commander.


Minot's munitions squadron commander was relieved of duty shortly after the incident.


Newton didn't name any of the officers, but Col. Bruce Emig had been the commander of the 5th Bomb Wing at Minot.


A number of other personnel -- "under 100," Newton said, including the entire 5th Bomb Wing at Minot -- have lost their certification to handle sensitive weaponry.


The matter will be referred to an Air Force convening authority to find out whether there's enough evidence to bring charges or any other disciplinary action against any personnel, Newton said.


Air Force Secretary Michael Wynne called the incident "an unacceptable mistake and a clear deviation from our exacting standards."


"We are making all appropriate changes to ensure that this has a minimal chance of happening again, but we would really like to ensure that it never happens again," he said.


Wynne has convened a blue-ribbon panel to review all of the Air Force's security procedures and adherence to them. That panel is to report back on January 15.


The probe into the incident, which ended this week, lists five errors -- all of them procedural failures to check, verify and inspect, Newton said.


The investigation found that nuclear warheads were improperly handled and procedures were not followed as the missiles were moved from their storage facility, transferred to the bomber and loaded onto it, Newton said.


The bomber carried six nuclear warheads on air-launched cruise missiles, but the warheads should have been removed from the missiles before they were attached to the B-52.


A munitions crew at Barksdale followed proper procedure when the plane landed, discovering the error and reporting it up the chain of command, Newton said.


The weapons were secured in the hands of airmen at all times and had been stored properly at Minot, Newton said.






A space odyssey: Man on the moon


Man on the moon, Malaysian in space... what else does the future hold?


WATCHING the Soyuz rocket launcher take off last week, carrying the first Malaysian into space, my mind went back 38 years to when I viewed, on TV, the first manned landing on the Moon.


At least, that's what I hope it was, because my recollection of the event is somewhat blurred. Actually, it was ostensibly something of great significance that I watched, back in 1969 - I just can't recall what I made of it as a mere lad.


We did not have a TV set at home, and I vaguely remember my father talking about some major cosmic event. Sometime in the evening, the family walked over to a neighbour's house to watch this on TV.


There was something said about landing on the moon, and of course, the name "Apollo 11" was bandied around, and also, Neil Armstrong. What I saw on TV remains vaguely imprinted on my mind ... fuzzy black and white images perhaps, maybe even the voices of men communicating through a vast distance, although I can't ascertain to this day if these were merely figments of my imagination.


Later, I learnt I had watched the first manned lunar landing.


Almost four decades on, it's still a bit of a blur to me whether I actually saw the event on TV. I know and believe man landed on the moon in 1969, despite the conspiracy theories that have been floating around since the 1970s. The point is - did we watch it on TV the way I remember? Does anyone have a clear recollection of the events as shown on TV? I, for one, would certainly like to know.


Give me the details, everything! Anyway, looking ahead, when manned landings on the Moon resume, perhaps we'll see a Malaysian traipse across the lunar surface.


Think of the excellent spin-offs in the food and beverage industry - anyone for a teh tarik lunar or roti canai Marikh? In fact, I'm surprised we haven't seen items like mee goreng Soyuz or roti ISS being offered so far.


Any takers?


While we're on the subject of celestial objects... I recently received an irate missive from a friend who was pretty pissed off because, somewhere, someone had written that we would look to colonising the moon and other planets in future because we would run out of space on Earth.


He was angry that anyone would perpetrate such false hope, because, as far as he and the rest of mankind were concerned, we had just one habitable planet and that was it - screw it up and we're on the slide to oblivion.


He offered one thought to control the expanding population - have fewer children - and cited China as an example. Me, I'm an idealist, having grown up on a steady science-fiction diet of Arthur C. Clarke, Isaac Asimov, Ray Bradbury and Star Trek. I would like to think - well, hope - that somewhere in the future, in half a millennium or so, man would have reached across the chasms of space to call other worlds home.


And that these worlds would be better than we could ever imagine Earth to be now ... and that people like you and me could hop regularly across cosmic distances without undergoing months of rigorous specialised training for this.


Hopes and dreams, after all, are what take us forward and indeed, make today's farfetched vision a reality. Look at how far we've come in communication technology - just take the mobile phone and the Internet for example. Hmm, then again...


Dreaming may be the safer option.





Show Tests Roaches' Radiation Resistance


Source: AP


Would cockroaches survive a nuclear holocaust that killed everything else? That question is being tested this week at the nearby Hanford nuclear reservation by a team from the "Mythbusters" show on the Discovery Channel, which expects to air the episode in about four months.


"It's been on the original list of myths since day one," said Kari Byron, who appears on the cable television series and was in town with Grant Imahara and Tory Belleci for the tests.


The crew is using an irradiator in the basement of Hanford's 318 Building just north of Richland. Pacific Northwest National Laboratory usually uses the device to calibrate dosimeters, which measure radiation exposure to humans and animals, and to check for radiation damage of video cameras, fiber optic cables and other equipment.


Lab operators agreed to the research for purposes of science education and workers donated their time, in some cases using part of their vacation allotments.


On Thursday afternoon, Byron and Imahara were cramming their uncooperative critters into a specially built roach condo to be exposed in the irradiator.


"I had to put myself in quite the mind-set to do it," Byron said.


A scientific supply company sent 200 cockroaches for the tests, "all laboratory-grade, farm fresh," Imahara said.


A control group of 50 will get no radiation; 50 others will be exposed to 1,000 rad, a lethal load of radiation for humans; 50 will be exposed to 10,000 rad; and the last 50 to 100,000 rad.


The bugs will be watched over the next couple of weeks to see how soon they die.
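The design, restated as a small sketch in Python: group sizes and doses are from the article, while the recorded deaths are hypothetical placeholders, not results.

```python
# Sketch of the dose design described above: four groups of 50 roaches.
DOSES_RAD = [0, 1_000, 10_000, 100_000]    # 0 rad is the control group
GROUP_SIZE = 50

experiment = {dose: {"exposed": GROUP_SIZE, "alive": GROUP_SIZE}
              for dose in DOSES_RAD}

def record_deaths(dose, deaths):
    """Log deaths observed in one dose group during the two-week watch."""
    experiment[dose]["alive"] -= deaths

record_deaths(100_000, 12)   # hypothetical observation, not a real result
for dose, group in experiment.items():
    print(f"{dose:>7} rad: {group['alive']}/{group['exposed']} alive")
```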


"Contrary to popular belief, not a significant amount of research goes into cockroach radiation," Imahara said.


Flour beetles and fruit flies, also being irradiated for comparison, were a snap compared with the cockroaches, which did not take well to being corralled within a tiny block arrangement designed to make sure each bug gets the same dosage.


"They are very fast. They are very aggressive. They want to get away," Byron said. "They are opportunists."


The surviving bugs get a chauffeured ride back to San Francisco. A "Mythbusters" employee has been detailed to drive them because airlines won't let them in the passenger cabin and they can't be placed in the baggage hold without wrecking the experiment.


"We have to maintain reasonable temperature and humidity so they don't go into shock," Imahara said.





Friday, October 19, 2007

Viacom mounts renewed attack on Google


Recently, YouTube has taken the initiative to develop a copyright-protection system to guard against piracy.


The content provider says the search giant is not doing enough to prevent clips being illegally shown on YouTube.


Viacom ramped up its offensive against Google yesterday, saying that it would not back down from its $1 billion lawsuit against the internet search engine.


Philippe Dauman, Viacom's chief executive, said that Google had not done enough to prevent content from being illegally uploaded to YouTube, and gave no impression that a settlement was near to being reached.


Viacom, the entertainment company which owns MTV and Nickelodeon, claims that Google allowed more than 160,000 clips of its programming to be uploaded to YouTube, the video-sharing website it owns.


Google denies that it is infringing Viacom's copyright, and claims that it removes unauthorised videos from YouTube when asked to by content owners.


Speaking at an internet conference in San Francisco, Mr Dauman said that he had "an open mind" about reaching an agreement with Google, which he described as a "responsible company", but that a settlement "wasn't quite there yet".


Referring to Google's proposed solution to the problem, a filtering system which allows new content being uploaded to be checked against a database of copyright material, he said: "They have a lot of tools, but they're not perfect. What no-one wants is a proprietary system that benefits one company to the exclusion of others."
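For the curious, here is a toy sketch of the kind of filtering being described: fingerprint each upload and check it against a database registered by content owners. Real systems use robust audio and video fingerprints that survive re-encoding and clipping; the cryptographic hash below catches only exact byte-for-byte copies and is purely illustrative.

```python
import hashlib

# Toy upload filter: fingerprints registered by content owners block
# matching uploads. Illustrative only; not Google's actual system.
registered = set()

def fingerprint(clip: bytes) -> str:
    return hashlib.sha256(clip).hexdigest()

def register_copyrighted(clip: bytes) -> None:
    registered.add(fingerprint(clip))

def allow_upload(clip: bytes) -> bool:
    """Reject any upload whose fingerprint matches registered material."""
    return fingerprint(clip) not in registered

register_copyrighted(b"episode-101-master")   # content owner registers a clip
print(allow_upload(b"episode-101-master"))    # False: blocked
print(allow_upload(b"home-video-of-a-cat"))   # True: allowed
```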


Mr Dauman said that what he would prefer would be an industry standard system, adding that it was "beyond the capacity of a company like ours, let alone smaller ones", to cope with a range of filtering technologies.


Earlier in the day Viacom and a range of other content producers, including Disney, CBS, Fox and NBC, as well as internet companies such as Microsoft and MySpace, announced that they would collaborate on a technology which would prevent users from uploading unauthorised material.


Google was not a party to the agreement, although analysts said it was not feasible for it to continue using its own technology while the rest of the internet and content industries were working to a common standard.


There was "a developing consensus among content creators and distributors" that whilst it was important content be widely available via the internet, there needed to be "rules of the road", Mr Dauman told an assembled audience at the Web 2.0 Summit in San Francisco.


The complaint of media companies such as Viacom is that they are left to bear the onus - and cost - of policing sites such as YouTube, onto which vast amounts of content are uploaded each day, for unauthorised material.


Google argues that it complies with the terms of the Digital Millennium Copyright Act (DMCA), and takes down copyright-infringing material when requested to do so by the copyright owner.




Viacom's Bet on Web Diversity.



Viacom CEO Philippe Dauman, whose company sued Google last year for $1 billion for alleged copyright violations on Google's YouTube video sharing site, journeyed into the belly of the beast a few minutes ago. He was, not surprisingly, unapologetic about the suit, which was not popular among the Web digerati. But in the process of defending his position, he did make it clear that Viacom is betting big on the notion that people online will travel to hundreds of individual Web sites for the content they want to view. That was underscored by today's announcement that Viacom would make clips of segments from The Daily Show With Jon Stewart available online for free. "We believe in fragmentation going forward on the Internet."

Of course, no one person wants to see all of Viacom's offerings, but I wonder if people really will click directly to all that many individual sites. The rise of YouTube may well depend on the presence of unauthorized videos, but there's a reason people flock there: They can find what they're looking for without having to click all over the Net. As Cisco senior VP Dan Scheinman said just a few minutes before, "The challenge of our era is, how do we find anything?"

Search helps, but it's clearly not the whole answer anytime soon. And I think people, online or off, want to gather where there are a whole bunch of other people.

Can Viacom fight that reality? Maybe, if it can get enough critical mass of fans for each of those sites. And it's hard to argue with $500 million in online revenue. But I can't imagine that will ever be completely sufficient. Still seems like there's more benefit in using YouTube--whose videos are hardly HDTV-quality--as a way to drive traffic to Viacom than in suing it and preventing users from finding what they want.





Space Shuttle Crew Arrives in Florida - STS-120 Bringing Space Station ‘Harmony’


The Shuttle Training Aircraft carrying the STS-120 astronauts has just arrived at NASA's Kennedy Space Center in Florida. The jet departed Ellington Field in Texas at about 11:24 a.m. EDT. Their arrival is being carried live on NASA TV.


Image Above: Commander Pam Melroy, with her crew behind her, greets the press upon touchdown at the Shuttle Landing Facility in Florida. Image credit: NASA TV

This follows the detailed flight readiness review on Tuesday, after which NASA senior managers announced Oct. 23 as the official launch date. Commander Pam Melroy and her six crewmates are scheduled to lift off at 11:38 a.m. EDT on their mission to the International Space Station.

The 14-day mission includes five spacewalks - four by shuttle crew members and one by the station's Expedition 16 crew. Discovery is expected to complete its mission and return home at 4:47 a.m. EST on Nov. 6.



Source: http://www.nasa.gov/mission_pages/shuttle/main/index.html


STS-120 Bringing Space Station 'Harmony'


Think of the next component set for delivery to the International Space Station as an international crossroads in space. That's the major function of the Italian-built U.S. module that will be ferried to the station aboard space shuttle Discovery during mission STS-120.

The pressurized component was named "Harmony" by U.S. students in a nationwide contest. "This module will allow all international partner pieces of the station to connect together, so it's really wonderful that kids recognize that harmony is necessary for space cooperation," said Bill Gerstenmaier, NASA's associate administrator for space operations, when the six winning schools who submitted the name were announced.

Image at Left: Earlier this year, STS-120 Commander Pam Melroy and Pilot George Zamka unveiled the winning name, Harmony, for the Node 2 module. Image credit: NASA/Jim Grossman

The module will be the connecting point between the U.S. Destiny lab, the European Space Agency's Columbus module and the Japanese Kibo module. Harmony's delivery to the station sets the stage for the following two space shuttle flights that will carry the Columbus and Kibo components to the station.

Those laboratory modules have been prepared side by side with Harmony in the high bay of the Space Station Processing Facility at NASA's Kennedy Space Center in Florida.

Harmony was delivered to Florida in 2003 aboard an Airbus Beluga aircraft. "The first thing we did was a post-delivery inspection and checkout to make sure the ferry flight from Italy to Florida didn't cause any damage to the module," said Glenn Chin, the payload mission manager for STS-120. Since then, a team of technicians has worked to prepare it for flight.

Image at Right: In 2003, a Beluga aircraft brought the module from Italy, where it was built, to the Kennedy Space Center in Florida, where it is transferred to the Space Station Processing Facility. Image credit: NASA

Weighing 31,500 pounds when loaded for flight, the 24-foot-long component will be parked in a temporary spot on the space station by the STS-120 crew, since the docked shuttle will be occupying Harmony's permanent spot. Later, the space station crew will relocate the module to its permanent location. Harmony will be the fourth named U.S. module on the station, taking its place with the Destiny laboratory, the Quest airlock and the Unity node.

Image at Left: Inside the high bay of the Space Station Processing Facility, a crane carries Harmony toward a weight stand. Image credit: NASA/Jim Grossman

"Harmony has been here with us for four years," explained Chin. "We've been working hard getting it ready for flight, and we've finally come to this stage of getting it ready to be put into the orbiter."

Before each mission, the astronauts who will deliver the space station element spend time inside the Space Station Processing Facility. "We had the crew here on about four different occasions" inspecting the module they will install on the orbiting outpost, said Chin.

While the excitement for the launch builds among those who have been working on the module since its arrival, there are also some mixed feelings about seeing it depart for its final destination.

Image at Right: Glenn Chin, mission manager for Harmony, describes the module during a media event in the Space Station Processing Facility. Image credit: NASA/Kim Shiflett

"It's an exhilarating feeling of excitement and we're all anxious to see Harmony get to the pad," said Chin. "I've never been a mission manager for any other mission that was as challenging." But even with all the excitement of the upcoming mission, he added that after working with the module for four years, "we'll definitely miss Harmony."

For more information on space station processing, visit the Observation Deck.








Is Mars alive, or is it only sleeping?



This is a shaded relief image derived from Mars Orbiter Laser Altimeter data, which flew onboard the Mars Global Surveyor. The image shows Olympus Mons and the three Tharsis Montes volcanoes: Arsia Mons, Pavonis Mons, and Ascraeus Mons from southwest to northeast. Credit: NASA



The surface of Mars is completely hostile to life as we know it. Martian deserts are blasted by radiation from the sun and space. The air is so thin, cold, and dry that if liquid water were present on the surface, it would freeze and boil at the same time. But there is evidence, like vast, dried-up riverbeds, that Mars once was a warm and wet world that could have supported life. Are the best times over, at least for life, on Mars?

New research raises the possibility that Mars could awaken from within -- three large Martian volcanoes may only be dormant, not extinct. Volcanic eruptions release lots of greenhouse gasses, like carbon dioxide, into the atmosphere. If the eruptions are not complete, and future eruptions are large enough, they could warm the Martian climate from its present extremely cold and dry state.


NASA-funded researchers traced the flow of molten rock (magma) beneath the three large Martian volcanoes by comparing their surface features to those found on Hawaiian volcanoes.


"On Earth, the Hawaiian islands were built from volcanoes that erupted as the Earth's crust slid over a hot spot -- a plume of rising magma," said Dr. Jacob Bleacher of Arizona State University and NASA's Goddard Space Flight Center in Greenbelt, Md. "Our research raises the possibility that the opposite happens on Mars - a plume might move beneath stationary crust." The observations could also indicate that the three Martian volcanoes might not be extinct. Bleacher is lead author of a paper on these results that appeared in the Journal of Geophysical Research, Planets, September 19.


The three volcanoes are in the Tharsis region of Mars. They are huge compared to terrestrial volcanoes, with each about 300 kilometers (186 miles) across. They form a chain heading northeast called the Tharsis Montes, from Arsia Mons just south of the Martian equator, to Pavonis Mons at the equator, to Ascraeus Mons slightly more than ten degrees north of the equator.


No volcanic activity has been observed at the Tharsis Montes, but the scarcity of large impact craters in the region indicates that they erupted relatively recently in Martian history. Features in lava flows around the Tharsis Montes reveal that later eruptions from large cracks, or rift zones, on the sides of these volcanoes might have started at Arsia Mons and moved northeast up the chain, according to the new research.


The researchers first studied lava flow features that are related to the eruptive history of Hawaiian volcanoes. On Hawaii (the Big Island), the youngest volcanoes are on the southeastern end, directly over the hot spot. As the Pacific crustal plate slowly moves to the northwest, the volcanoes are carried away from the hotspot. Over time, the movement has created a chain of islands made from extinct volcanoes.


Volcanoes over the hot spot have the hottest lava. Its high temperature allows it to flow freely. A steady supply of magma from the hot spot means the eruptions last longer. Lengthy eruptions form lava tubes as the surface of the lava flow cools and crusts over, while lava continues to flow beneath. After the eruption, the tube empties and the surface collapses, revealing the hidden tube.


As the volcano is carried away from the hot spot, magma has to travel farther to reach it, and the magma cools. Cooler magma makes the lava flow more slowly compared to lava at the younger volcanoes, like the way molasses flows more slowly than water. The supply of magma is not as steady, and the eruptions are shorter. Brief eruptions of slowly flowing lava form channels instead of tubes. Flows with channels partially or completely cover the earlier flows with tubes.


As the volcano moves even further from the hot spot, only isolated pockets of rising magma remain. As the magma cools, it releases trapped gas. This creates short, explosive eruptions of cinders (gas bubbles out of the lava, forming sponge-like cinder stones). Earlier flows become covered with piles of cinders, called cinder cones, which form around these eruptions.

"We thought we could take what we learned about lava flow features on Hawaiian volcanoes and apply it to Martian volcanoes to reveal their history," said Bleacher. "The problem was that until recently, there were no photos with sufficient detail over large surface areas to reveal these features on Martian volcanoes. We finally have pictures with enough detail from the latest missions to Mars, including NASA's Mars Odyssey and Mars Global Surveyor, and the European Space Agency's Mars Express missions."

Using images and data from these missions, the team discovered that the main flanks of the Tharsis Montes volcanoes were all alike, with lava channels covering the few visible lava tubes. However, each volcano experienced a later eruption that behaved differently. Lava issued from cracks (rifts) on the sides of the volcanoes, forming large lava aprons, called rift aprons by the team.

The new observations show that the rift apron on the northernmost volcano, Ascraeus Mons, has the most tubes, many of which are not buried by lava channels. Since tube flows are the first to form over a hot spot, this indicates that Ascraeus was likely active more recently. The flow on the southernmost volcano, Arsia Mons, has the least tubes, indicating that its rift aprons are older. Also, the team saw more channel flows partially burying tube flows at Arsia. These trends across the volcanic chain indicate that the rift aprons might have shared a common source like the Hawaiian volcanoes, and that apron eruptions started at Arsia, then moved northward, burying the earlier tube flows at Arsia with channel flows.
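A hedged sketch of that relative-dating logic: rank each volcano by how much tube-dominated flow remains exposed versus buried under later channel flows. The feature counts below are invented placeholders for illustration, not the paper's data.

```python
# Invented placeholder counts; the real study maps flow features rather
# than publishing simple tallies. Heuristic: exposed lava tubes suggest
# recent, steady eruptions; channel flows burying tubes suggest older ones.
observations = {
    "Arsia Mons":    {"exposed_tubes": 2, "channels_over_tubes": 9},
    "Pavonis Mons":  {"exposed_tubes": 5, "channels_over_tubes": 5},
    "Ascraeus Mons": {"exposed_tubes": 9, "channels_over_tubes": 2},
}

def youth_score(obs):
    """Higher score = activity more recent under the tube/channel heuristic."""
    return obs["exposed_tubes"] - obs["channels_over_tubes"]

ranking = sorted(observations, key=lambda v: youth_score(observations[v]),
                 reverse=True)
print("Most to least recently active:", ranking)
# -> Ascraeus, Pavonis, Arsia: the south-to-north trend described above
```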

Since there is no evidence for widespread crustal plate movement on Mars, one explanation is that the magma plume could have moved beneath the Tharsis Montes volcanoes, according to the team. This is opposite to the situation at Hawaii, where volcanoes move over a plume that is either stationary or moving much more slowly. Another scenario that could explain the features is a stationary plume that spreads out as it nears the surface, like smoke hitting a ceiling. The plume could have remained under Arsia and spread northward toward Ascraeus. "Our evidence doesn't favor either scenario, but one way to explain the trends we see is for a plume to move under the stationary Martian crust," said Bleacher.

The team also did not see any cinder cone features on any of the Tharsis Montes rift apron flows. Since cinder cone eruptions are the final stage of hot spot volcanoes, the rift apron eruptions might only be dormant, not extinct, according to the team. If the eruptions are not complete, and future eruptions are large enough, they could contribute significant amounts of water and carbon dioxide to the Martian atmosphere.





Researchers measure carbon nanotube interaction


An artist's representation of an amine functional group attached to an AFM tip approaching a carbon nanotube surface in toluene solution. The translucent blue shape on the nanotube represents the polarization charge forming on the nanotube as a result of the interaction with the approaching molecule. Chemical force microscopy measures the tiny forces generated by this single functional-group interaction. (Illustration by Scott Dougherty, LLNL)

Carbon nanotubes have been employed for a variety of uses including composite materials, biosensors, nano-electronic circuits and membranes.




While they have proven useful for these purposes, no one really knows much about what's going on at the molecular level. For example, how do nanotubes and chemical functional groups interact with each other on the atomic scale? Answering this question could lead to improvements in future nano devices.


In a quest to find the answer, researchers for the first time have been able to measure a specific interaction for a single functional group with carbon nanotubes using chemical force microscopy - a nanoscale technique that measures interaction forces using tiny spring-like sensors. Functional groups are the smallest specific group of atoms within a molecule that determine the characteristic chemical reactions of that molecule.


A recent report by a team of Lawrence Livermore National Laboratory researchers and colleagues found that the interaction strength does not follow conventional trends of increasing polarity or repelling water. Instead, it depends on the intricate electronic interactions between the nanotube and the functional group.

"This work pushes chemical force microscopy into a new territory," said Aleksandr Noy, lead author of the paper that appears in the Oct. 14 online issue of the journal Nature Nanotechnology.


Understanding the interactions between carbon nanotubes (CNTs) and individual chemical functional groups is necessary for the engineering of future generations of sensors and nano devices that will rely on single-molecule coupling between components. Carbon nanotubes are extremely small, which makes it particularly difficult to measure the adhesion force of an individual molecule at the carbon nanotube surface. In the past, researchers had to rely on modeling, indirect measurements and large microscale tests.


But the Livermore team went a step further and smaller to get a more exact measurement. The scientists were able to achieve a true single functional-group interaction by reducing the probe-nanotube contact area to about 1.3 nanometers (one million nanometers equals one millimeter).


Adhesion force graphs showed that the interaction forces vary significantly from one functionality to the next. To understand these measurements, the researchers collaborated with a team of computational chemists who performed ab initio simulations of the interactions of functional groups with the sidewall of a zig-zag carbon nanotube. The calculations showed a strong dependence of the interaction strength on the electronic structure of the interacting molecule/CNT system. To the researchers' delight, the calculated interaction forces matched the experimental results exactly.


"This is the first time we were able to make a direct comparison between an experimental measurement of an interaction and an ab initio calculation for a real-world materials system," Noy said. "In the past, there has always been a gap between what we could measure in an experiment and what the computational methods could do. It is exciting to be able to bridge that gap."


This research opens up a new capability for nanoscale materials science. The ability to measure interactions at the level of a single functional group could eliminate much of the guesswork that goes into the design of new nanocomposite materials, nanosensors, or molecular assemblies, which in turn could help in building better, stronger materials and more sensitive devices and sensors in the future.




Thin films of silicon nanoparticles roll into flexible nanotubes


By depositing nanoparticles onto a charged surface, researchers at the University of Illinois at Urbana-Champaign have crafted nanotubes from silicon that are flexible and nearly as soft as rubber.


"Resembling miniature scrolls, the nanotubes could prove useful as catalysts, guided laser cavities and nanorobots," said Sahraoui Chaieb, a professor of mechanical and industrial engineering at Illinois and a researcher at the Beckman Institute for Advanced Science and Technology.


To create their flexible nanotubes, Chaieb and his colleagues - physics professor Munir Nayfeh and graduate research assistant Adam Smith - start with a colloidal suspension of silicon nanoparticles (each particle is about 1 nanometer in diameter) in alcohol. By applying an electric field, the researchers drive the nanoparticles to the surface of a positively charged substrate, where they form a thin film.


Upon drying, the film spontaneously detaches from the substrate and rolls into a nanotube. The researchers have produced nanotubes with diameters ranging from 0.2 to 5 microns and lengths of up to 100 microns.


Using an atomic force microscope, the researchers found that the Young's modulus (a measure of a material's elasticity) of the film was about 5,000 times smaller than that of bulk silicon, but just 30 times larger than that of rubber.
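
As a quick consistency check on those two ratios, one can plug in textbook moduli (assumed values, not numbers from the study): bulk silicon at roughly 150 GPa and soft rubber at roughly 1 MPa. A short Python sketch:

E_SILICON = 150e9  # Pa; textbook value for bulk silicon (assumed)
E_RUBBER = 1e6     # Pa; textbook value for soft rubber (assumed)

film_from_silicon = E_SILICON / 5000  # "5,000 times smaller than bulk silicon"
film_from_rubber = E_RUBBER * 30      # "30 times larger than rubber"

print(film_from_silicon / 1e6, "MPa")  # -> 30.0 MPa
print(film_from_rubber / 1e6, "MPa")   # -> 30.0 MPa

Both routes land on a modulus of about 30 MPa, so the two reported comparisons describe the same, remarkably soft film.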


"We suspect that the nanotubes consist of silicon nanoparticles held together by oxygen atoms to form a three-dimensional network," Chaieb said. "The nanotubes are soft and flexible because of the presence of the oxygen atoms. This simple bottom-up approach will give other researchers ideas how to build inexpensive active structures for lab-on-chip applications."


"Because the silicon nanoparticles - which are made using a basic electrochemical procedure - have properties such as photoluminescence, photostability and stimulated emission, the resulting nanotubes might serve as nanodiodes and flexible lasers that could be controlled with an electric field," Nayfeh said.


The results will be reported in an upcoming issue of the journal Applied Physics Letters. The work was funded by the National Science Foundation and the state of Illinois.








Brain Images Make Cognitive Research More Believable : neuroscience


The brain is among the most complex things we study: how it works, how it thinks, and what it accepts or rejects are all vital research questions. (Image credit: iStockphoto/Aaron Kondziela)




People are more likely to believe findings from a neuroscience study when the report is paired with a colored image of a brain as opposed to other representational images of data such as bar graphs, according to a new Colorado State University study.


Persuasive influence on public perception.


Scientists and journalists have recently suggested that brain images have a persuasive influence on the public perception of research on cognition. This idea was tested directly in a series of experiments reported by David McCabe, an assistant professor in the Department of Psychology at Colorado State, and his colleague Alan Castel, an assistant professor at University of California-Los Angeles. The paper, forthcoming in the journal Cognition, was recently published online.


"We found the use of brain images to represent the level of brain activity associated with cognitive processes clearly influenced ratings of scientific merit," McCabe said. "This sort of visual evidence of physical systems at work is typical in areas of science like chemistry and physics, but has not traditionally been associated with research on cognition.


"We think this is the reason people find brain images compelling. The images provide a physical basis for thinking."


Brain images compelling


In a series of three experiments, undergraduate students either read brief articles making fictitious, unsubstantiated claims such as "watching television increases math skills," or read a real article describing research showing that brain imaging can be used as a lie detector.


When the research participants were asked to rate their agreement with the conclusions reached in the article, ratings were higher when a brain image had accompanied the article, compared to when it did not include a brain image or included a bar graph representing the data.


This effect occurred regardless of whether the article described a fictitious, implausible finding or realistic research.


Conclusions often oversimplified and misrepresented


"Cognitive neuroscience studies which appear in mainstream media are often oversimplified and conclusions can be misrepresented," McCabe said. "We hope that our findings get people thinking more before making sensational claims based on brain imaging data, such as when they claim there is a 'God spot' in the brain."


Article: "Seeing is believing: The effect of brain images on judgments and scientific reasoning."






MIT finds new role for well-known protein


Research could lead to treatments for Alzheimer's and Parkinson's


Fluorescent micrograph (scale bar: 10 micrometers) shows yeast cells (red) with septin (green), which enables the budding of daughter cells. MIT researchers have found septin also helps neurons sprout the branch-like protrusions used to communicate with other neurons. Image / Philippsen Lab, Biozentrum B




In a finding that may lead to potential new treatments for diseases such as Alzheimer's and Parkinson's, researchers at the Picower Institute for Learning and Memory at MIT report an unexpected role in the brain for a well-known protein.

A study by Morgan H. Sheng, Menicon Professor of Neuroscience and a Howard Hughes Medical Institute investigator, and colleagues appearing in the Oct. 23 issue of Current Biology shows that the same protein that enables a yeast cell to bud into two daughter cells also helps neurons sprout the branch-like protrusions used to communicate with other neurons.


The work revolves around septins--proteins known since the 1970s to play an essential function in the process through which the cytoplasm of a single yeast cell divides. "In yeast, septin is localized exactly at the neck between the yeast mother cell and the bud or emerging daughter cell," Sheng said. "Amazingly, we found septin protein localized at the base of the neck of neuronal dendritic spines and at the branchpoint of dendritic branches."


Nine of the 14 septins found in mammals are present in the brain. One of them, Sept7, is the most abundant, but its role has been unclear. Septins form long filaments and act as scaffolds, recruiting other proteins to their assigned roles as builders of the cell infrastructure.


While neurons don't divide, they do form protrusions that eventually elongate into dendritic branches. Dendrites, from the Greek word for "tree," conduct electrical stimulation from other neurons to the cell body of the neuron from which the dendrites project.


Electrical stimulation is transmitted via synapses, which are located at various points along the dendritic branches. Dendrites play a critical role in receiving these synaptic inputs. "Because dendritic spines are important for synaptic function and memory formation, understanding of septins may help to prevent the loss of spines and synapses that accompanies many neurodegenerative diseases," said co-author Tomoko Tada, a postdoctoral associate in the Picower Institute. "Septin could be a potential target protein to treat these diseases."


Moreover, in the cultured hippocampal neurons the researchers used in the study, septin was essential for normal branching and spine formation. An abundance of septin made dendrites grow and proliferate while a dearth of septin made them small and malformed.


"Boosting septin expression and function would enhance the stability of spines and synapses, and therefore be good for cognitive functions such as learning and memory," Sheng said. His laboratory is now exploring ways to prevent septin degradation and loss.


In addition to Sheng and Tada, authors are MIT affiliates Alyson Simonetta and Matthew Batterton; Makoto Kinoshita of Kyoto University Graduate School of Medicine; and Picower postdoctoral associate Dieter Edbauer.


This work is supported by the National Institutes of Health and the RIKEN-MIT Neuroscience Research Center.





Toxic Releases Down From North American Industry Leaders




Pollution and its prevention are perennial concerns, and reducing toxic releases is central to the CEC's goal. The latest Taking Stock report from the Commission for Environmental Cooperation (CEC) reveals that a continued decline in releases of toxic chemicals to the environment--15 percent for the United States and Canada from 1998 to 2004--is being driven by a group of industrial facilities that are the largest generators of emissions.

The CEC report, however, also reveals that the leading role of the largest waste-producing facilities stands in stark contrast to a substantial increase in chemical releases and transfers by a much larger group of industrial facilities that report lower volumes of emissions.


Released October 18, the annual report compares industrial pollution from a matched set of facilities in Canada and the United States--three million tonnes of chemicals released or transferred in the two countries in 2004. Over one-third of that amount was released at the location of reporting facilities, including over 700,000 tonnes released to the air, with another third transferred to recycling. For the first time, the CEC report also provides data from Mexico. Across the three countries, metals and their compounds--lead, chromium, nickel and mercury--were reported by the highest proportion of facilities.


"The evidence is clear that industry and government action to limit chemical releases is showing steady progress," said Adrián Vázquez-Gálvez, CEC's executive director. "It is equally clear that a large number of small and medium-size industrial facilities need to do a better job in reducing their waste and emissions if we are going to see even greater progress in North America. We trust the progress shown by industry leaders and the fact that pollution prevention is a proven strategy will encourage everyone to tackle pollution issues at the source."


The CEC's analysis demonstrates that facilities from Canada and the United States that reported pollution prevention activities--product and process redesign, spill and leak detection, and substituting raw materials--showed reductions from 2002 to 2004. Facilities not engaged in these activities did not show similar progress.


A new chapter provides a detailed look at industrial recycling, finding that over one-third of US and Canadian releases and transfers reported in 2004--more than 1 million tonnes--were recycled. Recycling has increased in recent years due to increases in production and in scrap metal prices. Most of the materials were metals, including copper, zinc, lead and their compounds.


The trilateral analysis is based on matched data from nine industrial sectors, 56 chemicals, and some 10,000 facilities, comparing releases and transfers for similar facilities in Canada, Mexico and the United States. The report identifies a different pattern of releases and transfers in each of the three countries.


Comparisons of the three countries' industrial emissions will continue to improve as the CEC works with governments, industry and NGOs to expand the number of chemicals and facilities that are comparable.


Taking Stock compiles data from Canada's National Pollutant Release Inventory, the United States' Toxics Release Inventory, and, starting with its first year of mandatory reporting in 2004, Mexico's pollutant release and transfer register, the Registro de Emisiones y Transferencia de Contaminantes.





Nobel awarded in economics for "mechanism design theory"



"WHAT on earth is mechanism design?" was the typical reaction to this year's Nobel prize in economics, announced on October 15th. In this era of "Freakonomics", in which everyone is discovering their inner economist, economics has become unexpectedly sexy. So what possessed the Nobel committee to honour a subject that sounds so thoroughly dismal? Why didn't they follow the lead of the peace-prize judges, who know not to let technicalities about being true to the meaning of the award get in the way of good headlines?


In fact, despite its dreary name, mechanism design is a hugely important area of economics, and underpins much of what dismal scientists do today. It goes to the heart of one of the biggest challenges in economics: how to arrange our economic interactions so that, when everyone behaves in a self-interested manner, the result is something we all like. The word "mechanism" refers to the institutions and the rules of the game that govern our economic activities, which can range from a Ministry of Planning in a command economy to the internal organisation of a company to trading in a market.


The real world rarely behaves like economics models do, so mechanism design is used to design markets and auctions that will better reflect the actions of the participants. Mechanism design is also used to look at how companies behave and to consider how governments can best provision public goods like defense or infrastructure. In general, mechanism design is applied to interactions where people or companies participating in the mechanism may have reasons to behave in a non-truthful or less than optimal way, and attempts to create rules and incentives to discourage this unwanted behavior.


The winners of the 2007 Nobel Memorial Prize in Economics, announced yesterday, are Leonid Hurwicz, Eric Maskin, and Roger Myerson. The three men received the prize for their work on "mechanism design theory," a field of economics that focuses on creating incentives and rules for an economic interaction such that the desired outcome, or some desirable properties, are achieved.


Hurwicz began working on mechanism design over 50 years ago by applying mathematical analysis to companies and to economic systems like capitalism and socialism. His major theoretical contribution is "incentive compatibility," the property that participants in a mechanism do best by voting or playing honestly. It's an important result, since we tend to want mechanisms like voting systems to encourage truthful voting, rather than encouraging people to disguise their true opinions.
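
The standard classroom illustration of incentive compatibility is the sealed-bid second-price (Vickrey) auction, in which the winner pays the second-highest bid. The Python sketch below (a simplified illustration, not an algorithm taken from the laureates' papers) checks numerically that bidding one's true value never does worse than shading or inflating the bid:

import random

def vickrey_utility(my_bid, my_value, other_bids):
    """One bidder's payoff: value minus price if they win, zero otherwise."""
    top_other = max(other_bids)
    if my_bid > top_other:
        return my_value - top_other  # winner pays the second-highest bid
    return 0.0

random.seed(1)
value = 10.0  # the bidder's true value for the item
for trial in range(1000):
    others = [random.uniform(0.0, 20.0) for _ in range(3)]
    truthful = vickrey_utility(value, value, others)
    shaded = vickrey_utility(0.7 * value, value, others)
    inflated = vickrey_utility(1.3 * value, value, others)
    assert truthful >= shaded and truthful >= inflated

Because the price is set by the other bids, misreporting can only change whether you win, never what you pay when you do, so truth-telling is a dominant strategy: the kind of property incentive compatibility formalizes.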


Although "mechanism design theory" may not sound like something you or I would need to interact with very much, it pops up in quite a few places. Take the upcoming 700MHz spectrum auctions, for example. For this auction, the government has some set of goals, including perhaps getting some payment and fairly allocating the spectrum. The companies also have goals, which may be to gobble up as much of the spectrum as possible. By applying some mechanism design theory to the situation, economists can then design an auction mechanism that best meets the goals of all the parties. This type of game theoretical analysis of auctions has been done by Roger Myerson, whose work has influenced these types of spectrum auctions.


Software patents are another area where mechanism design comes into play. One of the Nobel laureates, Eric Maskin, has done some work on patent valuation. In particular, Maskin is critical of the software patent system, which he believes is harmful to innovation when new inventions are closely related to old ones. His (very) basic argument is that in many technology fields, competition is actually better for firms in the long run. Patents generally lead to less innovation in a particular field, and also lead to less competition since companies can't work on the same types of products. Thus, in the end, patents are bad for software and technology companies, because of how they limit competition.


If you're at all interested in mechanism design theory, I would highly recommend checking out the scientific background for the prize, since it provides a nice overview of the key results from the work of Hurwicz, Maskin, and Myerson. It can be a bit daunting to delve into, particularly since it's not a field of economics that gets talked about at your average cocktail party, but it's worth a look given the sheer number of social and governmental situations that rely on mechanism design to operate more efficiently.





Thursday, October 18, 2007

DNA pioneer James Watson says he is 'mortified' by race comments : the apparent claim that black people are less intelligent than whites


LONDON: The DNA pioneer James Watson today apologised "unreservedly" for his apparent claim that black people are less intelligent than whites.


"I am mortified about what has happened," he told a group of scientists and journalists at the launch of his new book, Avoid Boring People, at the Royal Society in London.


The American scientist at the center of a media storm over comments suggesting that black people were not as intelligent as whites said Thursday he never meant to imply that the African continent was genetically inferior, adding that he was mortified over the attention his words had drawn.


James Watson, who won the Nobel Prize for co-discovering the molecular structure of DNA, has been sharply criticized in Britain for reportedly saying tests showed Africans did not have the same level of intelligence as whites.


In its profile of Watson, The Sunday Times Magazine quoted him as saying he was "inherently gloomy about the prospect of Africa" because "all our social policies are based on the fact that their intelligence is the same as ours - whereas all the testing says not really."


Watson's interview in the magazine received wide play, touching off a furious reaction in Britain. The Independent newspaper put Watson on its front page Wednesday, and on Thursday the Daily Mail devoted a column to criticism of his "incendiary claim."


Watson, who arrived in Britain on Thursday to promote his new book, "Avoid Boring People: Lessons From a Life in Science," appeared at a reception Thursday night at the Royal Society, Britain's leading scientific academy. The Associated Press was refused entry to the event, described by his publicist as a private gathering with friends. But in a written statement given to the AP, Watson said he was "mortified by what had happened."


"I cannot understand how I could have said what I am quoted as having said," he said. "To all those who have drawn the inference from my words that Africa, as a continent, is somehow genetically inferior, I can only apologize unreservedly. That is not what I meant. More importantly from my point of view, there is no scientific basis for such a belief."


Kate Farquhar-Thomson, his publicist, refused to say whether Watson believed The Sunday Times had quoted him accurately. "You have the statement. That's it, I'm afraid," she said.


Watson, 79, is a molecular biologist who serves as chancellor of the Cold Spring Harbor Laboratory in New York, a world leader in research into cancer and neurological diseases. The laboratory issued a statement saying its board of trustees vehemently disagreed with his remarks and that they were "bewildered and saddened if he indeed made such comments."


The author of several books, Watson has been well-known in Britain since his days at Cambridge University in the 1950s and 1960s on the trail of DNA's molecular structure. Watson, Francis Crick and Maurice Wilkins won the 1962 Nobel Prize for their work on the subject.


In the magazine interview, Watson was quoted as saying he opposes discrimination and believes that "there are many people of color who are very talented." But he also was quoted as saying that while he hopes that everyone is equal, "people who have to deal with black employees find this not true." Watson's statement did not directly address those remarks.


The interview caused outrage in Britain.


David Lammy, the government's skills minister, said Thursday that Watson's remarks were "deeply offensive" and would "succeed only in providing oxygen" for the British National Party, a small, far-right political party that has been accused of being racist.


"It is a shame that a man with a record of scientific distinction should see his work overshadowed by his own irrational prejudices," Lammy said. "It is no surprise to me that the scientific community has condemned this outburst, and I think people will recognize these comments for what they are."


Watson has caused controversy in the past, reportedly saying that a woman should have the right to abort her unborn child if tests could determine it would be homosexual.


He also suggested a link between skin color and sex drive, proposing a theory that black people have higher libidos.


Jan Schnupp, a lecturer in neurophysiology at Oxford University, said Watson's remarks "make it very clear that he is an expert on genetics, not on intelligence."


Schnupp said undernourished and undereducated people often perform worse on intelligence tests than the well off.


"Race has nothing to do with it, and there is no fundamental obstacle to black people becoming exceptionally bright," Schnupp said.









MCAS science results mixed


Bernazzani Elementary School pupil Andrew Spada, 10, hangs from a Velcro wall during a hip-hop science education concert focusing on gravity and motion at North Quincy High School. (DOMINIC CHAVEZ/GLOBE STAFF)




School administrators in Southeastern Massachusetts say they need to bolster scores on the state's new high school science exam, even though the majority of the region's public school districts outperformed the state average in the scores released this week.

Because passing the exam is becoming a high school graduation requirement starting with this year's 10th-graders, just doing better than the state average will not be good enough to ensure that the region's seniors all have high enough scores to get their diplomas, school administrators say.


The science exam was given for the first time this spring to ninth- and 10th-graders. Students can take the exam in biology, chemistry, physics, or engineering-technology. Under the new rules, a student must pass one of the four to get a diploma.


On biology, the most popular test, 39 of the 49 area districts where ninth- and 10th-graders took the test passed at a higher rate than the state average of 76 percent, according to a Globe analysis of state data released on Monday.


Norwell students did the best, with a pass rate of 99 percent; Randolph did the worst at 54 percent. The median pass rate for the region was 83 percent.


"We are happy when we see successful scores, but that doesn't mean we pause in the glory. We still need to continue to move forward to make all students successful," said June Doe, the superintendent of schools in Dedham where more than 90 percent of the high school students passed the biology and chemistry tests.


Because the state provided only the combined pass rates for grades 9 and 10, it is unclear how close this region's class of 2010 is to meeting the high school graduation requirement.


The following are highlights of the other three science tests results for the ninth- and 10th-graders:


Out of the 30 districts in the region in which students took the chemistry test, 25 outperformed the state passing rate of 61 percent. At five schools - Cohasset, Foxborough, Foxborough Regional Charter, Westwood, and Weymouth - all test-takers passed. Old Rochester and Hull tied for the lowest rate of 42 percent. The median passing rate for the region was 85 percent.


Out of the 13 districts in the region where students took the introduction to physics test, nine performed better than the state average of 78 percent. Sharon had the highest pass rate of 99 percent, while Blue Hills Regional Technical had the lowest at 15 percent. The region's median was 89 percent.


Only two districts took the engineering-technology test. Plymouth had a passing rate of 79 percent, 3 percentage points higher than the state, while Brockton had 58 percent.


To pass the exam, students need to score in the top three categories on the MCAS - advanced, proficient, or needs improvement. The fourth category is warning/failing.


School administrators say they will offer remediation for students who did not pass the exams, but in many cases they have not developed specifics. One sticking point: By the time districts receive scores in the fall, students often have already moved on to a new science course.


So, a student who failed the biology exam but passed the biology course at the high school could now be sitting in a chemistry class, raising the question of whether it's better for the district to tutor the student in biology for a retake or devote additional help to the student in chemistry so he or she can take that state exam with better success.


"We will have to determine that on a case-by-case basis," said Brenda Hodges, superintendent of the Mansfield schools, where 92 percent of students passed the biology exam.


More broadly, administrators say they are reevaluating their science programs to make sure the courses align with state standards. A few districts said they might reshuffle the order in which students take courses, or change the length of time students have in classes.


In Carver, where 80 percent of students passed the biology exam, the high school might replace a freshman earth science course with an introductory physics class. The school would then stick with the current cycle of having students take biology in their sophomore year and chemistry in their junior year.


"We want to give more kids a chance to succeed," said Elizabeth Sorrell, superintendent of the Carver schools and a former science curriculum coordinator. "I do think the test is a good idea. Pretty much any job you do requires some science and technology."


In Walpole, where more than 90 percent of students passed the biology and chemistry tests, the high school has added another science teacher so classes are around 25 students, compared to about 30 last year.


"Students are now able to have their own lab stations in the science rooms," said Jean Kenney, director of curriculum, instruction, and grants in Walpole. "Our teachers have students focus a lot on hands-on work. We try to make it real world science. We want applications [of science] to be seen as everyday examples."


Getting students to pass the science exam comes down to one basic principle - having them practice the knowledge and skills, said Abington Superintendent of Schools Peter Schafer. "It's like hitting a baseball," Schafer said. "The more you go to the batting cage, the better you get."


The effort to improve extends beyond high schools. Many districts are revamping curriculums in middle and elementary schools, too. Administrators say it's critical to get students excited about science at the youngest age possible so they will want to keep studying it. State policy makers and business leaders say the state needs more highly trained workers for the science fields, especially the booming biotechnology industry.


Quincy officials last month tried to entice fifth- and sixth-graders into a love affair with science by putting Newton's laws of motion and the universal law of gravitation to hip-hop lyrics. Students filled a school gymnasium to hear a special program put on by Honeywell and NASA that demonstrated the laws by using things like rocket launchers and Velcro against the backdrop of three hip-hop singers.


The show dazzled Meghan MacDonald, 11. Although she loves science, she said, she is unsure whether she would pursue it as a career.


"I'm thinking about working with animals," she said, "but I've always wanted to go to the moon."



