DXPG

Sunday, April 28, 2013

Court Ruling Takes a Stand on Essential High-Tech Patents

The high-tech patents wars are fed by the value of patents as weapons for extracting rich sums from companies and competitors.

But courts are blunting the patent weapon, at least for the kinds of patents deemed vital for communications and data-handling in devices like smartphones, tablets and online game consoles. That trend took another step with an opinion issued last Thursday by a judge for the United States District Court in Seattle.

In his 207-page ruling, Judge James L. Robart took on the issue of pricing for so-called standard-essential patents. These are patents that their corporate owners have pledged to license to others on terms that are “reasonable and nondiscriminatory,” often known as RAND. All well and good, but what is reasonable to the owner might seem like extortion to the licensee, depending on the price. That kind of standoff becomes more likely if the two companies negotiating are rivals in the marketplace.

With clear prose and some clever math, Judge Robart concluded that when a company has made a RAND commitment to an industry standards organization, the price should be low. That is especially important, he said, for the intellectual property in complex digital devices that are bundles of many hardware and software technologies.

The ruling, according to Arti K. Rai, a professor at the Duke University School of Law, “fits into a long line of recent cases in which courts are squarely rejecting attempts by patentees to claim high reasonable royalty figures when the patent in question is just a small piece of the product.”

The case in federal court in Seattle is a breach-of-contract dispute between Microsoft and Motorola, whose mobile phone unit, Motorola Mobility, Google bought in 2011 for $12.5 billion. Google picked up 17,000 patents in the deal, which closed last year.

In essence, Microsoft argued that Motorola bargained in bad faith by initially offering outlandish terms to license its patents on a wireless communication standard, 802.11, and another standard for video compression, H.264.

Microsoft contends that Motorola’s first offer, if applied to a wide range of Microsoft products, might result in royalty payments of more than $4 billion a year. Motorola has replied in court that opening offers are nearly always negotiated down substantially, and that Motorola was mainly seeking a license deal on Microsoft’s Xbox video console rather than Microsoft’s wider product portfolio.

Still, Judge Robart determined that a reasonable rate for licensing the Motorola patents would be just under $1.8 million a year. That is not far from what Microsoft was offering as reasonable, about $1.2 million a year.

In his ruling, the judge set out some basic principles. An important one, he said, is that “a RAND royalty should be set at a level consistent with the S.S.O.s’ (standard setting organizations) goal of promoting widespread adoption of their standards.”

Later, Judge Robart explained the problem with relatively high royalties on standard-essential patents. He noted that at least 92 companies and organizations hold patents involved in the 802.11 standard for wireless communication. If they all sought the same terms as Motorola, he wrote, “the aggregate royalty to implement the 802.11 standard, which is only one feature of the Xbox product, would exceed the total product price.”
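The arithmetic behind that point is simple to sketch. The per-holder rate below is hypothetical, chosen only to illustrate the stacking mechanism the judge described; the ruling itself does not use this number.

```python
# Royalty stacking: if every holder of a standard-essential patent demanded
# the same percentage of the product's price, the rates add up.

def aggregate_royalty(per_holder_rate: float, num_holders: int) -> float:
    """Aggregate royalty rate if every patent holder demanded the same terms."""
    return per_holder_rate * num_holders

num_holders = 92           # patent holders in the 802.11 standard, per the ruling
hypothetical_rate = 0.02   # 2% of the product price per holder (illustrative only)

total = aggregate_royalty(hypothetical_rate, num_holders)
print(f"Aggregate royalty: {total:.0%} of the product price")  # prints 184%
# Anything above 100% means royalties alone would exceed the product's price,
# before the cost of actually building the device is counted at all.
```

Even a seemingly modest per-holder rate becomes untenable once multiplied across dozens of holders, which is why the judge tied a RAND rate to the standard's goal of widespread adoption.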

Judge Robart’s ruling covers only one part of one patent case: a price for reasonable licensing terms on Motorola’s patents. And the case is continuing. But his opinion, said Jorge L. Contreras, an associate professor of law at American University, detailed “some overarching principles that apply in cases like this. He emphasized that there was a social good that should be taken into account, and what is good for the whole market, not just for the two parties involved in the litigation.”

The ruling, Mr. Contreras added, “makes the big picture a lot clearer.”



Disruptions: No Words, No Gestures, Just Your Brain as a Control Pad

Last week, engineers sniffing around the programming code for Google Glass found hidden examples of ways that people might interact with the wearable computers without having to say a word. Among them, a user could nod to turn the glasses on or off. Taking a picture might be accomplished with a single wink.

But don’t expect these gestures to be necessary for long. Soon, we might be interacting with our smartphones and computers simply by using our minds. In the next couple of years, we could be turning on the lights at home just by thinking about it, or sending an e-mail from our smartphone without even pulling the device from our pocket. Further into the future, our robot assistant will appear by our side with a glass of fresh lemonade simply because it knows we’re thirsty.

Researchers in Samsung’s Emerging Technology Lab are testing tablets that can be controlled by your brain, using a cap that resembles a ski hat studded with monitoring electrodes, MIT Technology Review, the science and technology journal of the Massachusetts Institute of Technology, reported this month.

The technology, often called brain computer interfaces, was conceived to enable people with paralysis and other disabilities to interact with computers or control robotic arms, all by simply thinking about such actions. Before long, these technologies could well be in consumer electronics, too.

Some crude brain-reading products already exist, letting people play easy games or move a mouse around a screen with their mind.

NeuroSky, a company based in San Jose, Calif., recently released a Bluetooth-enabled headset that can monitor slight brain movements and allow people to play concentration-based games on computers and smartphones. These include a zombie-chasing game, archery and a game where you dodge bullets; all these apps use your mind as the joystick. Another company, Emotiv, sells a headset that looks like a large alien hand and can read brain waves associated with thoughts, feelings and expressions. The device can be used to play Tetris-like games or search through Flickr photos by thinking about an emotion the person is feeling, like happy or excited, rather than searching by keywords. Muse, a lightweight, wireless headband, can engage with an app that “exercises the brain” by forcing people to concentrate on aspects of a screen, almost like taking your mind to the gym.

Car manufacturers are exploring technologies packed into the back of the seat that detect when people fall asleep while driving and rattle the steering wheel to awaken them.

But the products commercially available today will soon look archaic. “The current brain technologies are like trying to listen to a conversation in a football stadium from a blimp,” explained John Donoghue, a neuroscientist and director of the Brown Institute for Brain Science. “To really be able to understand what is going on with the brain today you need to surgically implant an array of sensors into the brain.” In other words, to gain access to the brain, for now you still need a chip in your head.

Last year, a project called BrainGate, pioneered by Dr. Donoghue, enabled two people with full paralysis to use a robotic arm, with a computer responding to their brain activity. One woman, who had not used her arms in 15 years, could grasp a bottle of coffee, serve herself a drink and then return the bottle to a table. All done by imagining the robotic arm’s movements.

But that chip inside the head could soon vanish, as scientists say we are poised to gain a much greater understanding of the brain and, in turn, technologies that empower brain computer interfaces. The Brain Activity Map project, a decade-long research initiative announced by the Obama administration this year, aims to build a comprehensive map of the brain.

Miyoung Chun, a molecular biologist and vice president for science programs at the Kavli Foundation, is working on the project. Although she said it would take a decade to map the brain completely, she predicted that companies would be able to build new kinds of brain computer interface products within two years.

“The Brain Activity Map will give hardware companies a lot of new tools that will change how we use smartphones and tablets,” Dr. Chun said. “It will revolutionize everything from robotic implants and neural prosthetics, to remote controls, which could be history in the foreseeable future when you can change your television channel by thinking about it.”

There are some fears to be addressed. On the Muse Web site, an F.A.Q. is devoted to convincing customers that the device cannot siphon thoughts from people’s minds.

These brain-reading technologies have been the stuff of science fiction for decades.

In the 1982 movie “Firefox,” Clint Eastwood plays a fighter pilot on a mission to the Soviet Union to steal a prototype fighter jet that can be controlled by a brain neurolink. But Mr. Eastwood has to think in Russian for the plane to work, and he almost dies when he cannot get the missiles to fire while in the middle of a dogfight. (Don’t worry, he survives.)

Although we won’t be flying planes with our minds anytime soon, surfing the Web on our smartphones might be closer.

Dr. Donoghue of Brown said one of the current techniques used to read people’s brains is called P300, in which a computer can determine which letter of the alphabet someone is thinking about based on the area of the brain that is activated when he sees a screen full of letters. But even when advances in brain-reading technologies speed up, there will be new challenges, as scientists will have to determine if the person wants to search the Web for something in particular, or if she is just thinking about a random topic.
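The selection logic behind a P300-style speller can be sketched in a few lines. Real systems classify EEG waveforms; here the evoked "responses" are simulated numbers, purely to illustrate the averaging-and-pick-the-strongest idea Dr. Donoghue describes. All names and noise levels are illustrative assumptions, not any lab's actual method.

```python
# Toy P300 speller: flash letters repeatedly, average each letter's evoked
# response, and select the letter with the strongest averaged response.
import random
import string

def p300_pick(responses: dict[str, list[float]]) -> str:
    """Return the letter whose flashes evoked the largest mean response."""
    return max(responses, key=lambda letter: sum(responses[letter]) / len(responses[letter]))

# Simulate 10 flash rounds: the attended letter ("G") evokes a stronger
# response (mean 1.0) than unattended letters (mean 0.0), plus noise.
rng = random.Random(0)
attended = "G"
responses = {
    letter: [rng.gauss(1.0 if letter == attended else 0.0, 0.3) for _ in range(10)]
    for letter in string.ascii_uppercase
}
print(p300_pick(responses))  # the attended letter, "G"
```

Averaging over repeated flashes is what makes the scheme work: a single noisy response is ambiguous, but the attended letter's signal accumulates while the noise averages out, which is also why these spellers are slow.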

“Just because I’m thinking about a steak medium-rare at a restaurant doesn’t mean I actually want that for dinner,” Dr. Donoghue said. “Just like Google glasses, which will have to know if you’re blinking because there is something in your eye or if you actually want to take a picture,” brain computer interfaces will need to know if you’re just thinking about that steak or really want to order it.


