Archive for the 'Computing & Society' Category

16 Feb 11

one more chance for humans?

I’ve been following with interest the “Jeopardy!” programs featuring Watson, the IBM supercomputer designed to play the game. On the first program Monday Watson took an early lead over the human “Jeopardy!” champs, Ken Jennings and Brad Rutter, but it seemed stumped when the high-dollar questions were reached, and the game ended in a tie between Watson and Rutter. In the second game Watson ran away with the lead.

So what caught my attention this morning was the headline of the article in the International Business Times: “Round Two Goes to Watson; Humans Have One More Chance.” What did the article’s author, Gabriel Pena, mean by “one more chance for humans”? One more chance to win “Jeopardy!”? Or something more existentially ominous: one more chance for humans before we’re replaced by machines smarter than we are?

I don’t think it’s time to panic. And I’m a real skeptic of ideas like the “singularity”: a time in the not-too-distant future at which computers become so intelligent and superior at controlling systems in our world that we’re irrelevant.

But my wife’s reaction to the prospect of Watson beating some really smart guys is telling. She asked, “So who’s going to lose their job?” Ah, yes! Haven’t we learned a few things during the so-called “Great Recession”? One of them is that people who are laid off are not being rehired; companies are investing in “productivity” tools rather than jobs. “Productivity” is a code word for doing the same work with fewer jobs. If you are not the person running the new productivity devices, then your job’s in jeopardy for sure. Implementing higher productivity has been a basic economic process for a long time, but Watson’s technology is enough to make the hair stand up on the back of your neck.

Technological productivity advances are like riding on the back of a tiger: if you stay on its back you’re okay, but if you fall off you’re chow. Watson is a productivity system that appears to have made a real stride in taking an English-language question and parsing it into a specific information request better than earlier question-answering technologies. IBM calls it “open question answering.” The company is going to turn this into a commercial product and apply it initially in medicine and medical law. It will help researchers or doctors plow through vast stores of unstructured information, like journals, to come up with answers to their questions more efficiently than anything before. As an IBM exec, David McQueeney, said to the Washington Post this morning:

“Imagine taking Watson, and instead of feeding it song lyrics and Shakespeare, imagine feeding it medical papers and theses,” he said. “And put that machine next to an expert human clinician.”

With the computer intelligently processing a vast amount of data and the doctor using his or her own professional knowledge to guide and refine a search, McQueeney said, IBM thinks the quality of that diagnosis could be better than what the doctor comes up with alone.

Looks like doctors will be the first up on the back of this tiger.

I have been a skeptic about artificial intelligence for a long time. I’ve been hearing that AI is right around the corner since the ’50s. Lots of claims have been made but, like the rocket belt, the flying car, nuclear fusion, undersea cities, and the cure for cancer, the expected results haven’t been delivered. Watson is by no means an equivalent of human intelligence, but it appears to be an indicator that progress is being made. We’re not about to be made totally obsolete any time soon — if ever — but the “one more chance” for many of us is to stay abreast of this emerging technology and use it as our tool rather than have it put us behind the 8-ball.

23 Nov 10

Single molecule computer chip

Mr. McGuire: I want to say one word to you. Just one word.
Benjamin: Yes, sir.
Mr. McGuire: Are you listening?
Benjamin: Yes, I am.
Mr. McGuire: Plastics.
Benjamin: Just how do you mean that, sir?

In the 1967 movie, The Graduate, a family friend, Mr. McGuire, offers Benjamin (Dustin Hoffman) just one word of advice to set him on a path to future success: “plastics.” That was more than 40 years ago. If I were to adopt a one-word recommendation for the emerging generation it would be: “nanotech.” I’ve mentioned this before.

I was reminded again last week of how dramatically the science and technology of billionth-of-a-meter devices is developing. Singapore’s Agency for Science, Technology and Research (A*STAR) Institute of Materials Research and Engineering announced a partnership with 10 European Union organizations for the ATMOL project, an effort to build a single-molecule computer processor.

A*STAR’s IMRE and 10 EU research organisations are working together to build what is essentially a single molecule processor chip. As a comparison, a thousand such molecular chips could fit into one of today’s microchips, the core device that determines computational speed. The ambitious project, termed Atomic Scale and Single Molecule Logic Gate Technologies (ATMOL), will establish a new process for making a complete molecular chip. This means that computing power can be increased significantly but take up only a small fraction of the space that is required by today’s standards.

The R&D effort will work on some cutting-edge techniques for creating molecular components: “The fabrication process involves the use of three unique ultra high vacuum (UHV) atomic scale interconnection machines which build the chip atom-by-atom. These machines physically move atoms into place one at a time at cryogenic temperatures.”

But here’s the thing about this path to success: How do you sustain a career in a field where your current knowledge is as evanescent as the morning dew? Riches will be made in nanotechnology, but knowledge obsolescence has been a problem for mid-career technicians in IT for decades. I don’t see how it’ll get any better.

30 Oct 10

A lesson in globalism with a local twist

When I moved up here near Portland several months ago I’d heard that it had what they called the “Silicon Forest.” To the west of Portland is a nest of high-tech companies including Nike, HP, Intel, Sun and quite an impressive list of others. And in the last few weeks there has been a lot of excitement over the announcement by Intel that it is going to expand its Hillsboro manufacturing and testing plant. Hillsboro is about five miles west of where I live.

In this down economy, the announcement of an $8 billion plant to support Intel’s cutting-edge 22 nm processor manufacturing is fantastic news. So the latest, greatest Intel CPUs will be made here for several years. It’ll mean somewhere between 6,000 and 8,000 construction jobs over the next two years and about 1,000 permanent technical jobs. Sweet!

A lot of discontent in the US these days is over how “our jobs have been shipped overseas!” The local Intel plant runs counter to that trend, so I was somewhat surprised to see an announcement in my Oregonian this morning that Intel’s CEO, Paul Otellini, was in Ho Chi Minh City, Vietnam, yesterday for a ceremony officially opening a $1 billion plant there. And earlier in the week he cut the ribbon on an Intel manufacturing plant in Dalian, China. Gee, do you remember when we were at war with North Vietnam and regularly bombing the crap out of the country? I sure do.

But it’s illustrative of how global commerce works. A lot of Americans seem to think that companies with US headquarters and household names somehow have an obligation to stay here and help employ Americans. Nah, that’s not how it works. These are not American corporations; they’re global, and they go anywhere in the world to get the lowest labor costs and the best deals on the rest of what it takes to make their products. They don’t see the world and their purpose in nationalistic terms. If you want to make money in the US, invest in Intel stock. A lot of Americans need to adjust their thinking to fit that reality.

11 May 10

another volley in the healthcare revolution

The Washington Post is reporting today that a company called Pathway Genomics is going to start selling an over-the-counter kit for testing certain genetic traits through Walgreens’ more than 6,000 drugstores.

Beginning Friday, shoppers in search of toothpaste, deodorant and laxatives at more than 6,000 drugstores across the nation will be able to pick up something new: a test to scan their genes for a propensity for Alzheimer’s disease, breast cancer, diabetes and other ailments.

The test also claims to offer a window into the chances of becoming obese, developing psoriasis and going blind. For those thinking of starting a family, it could alert them to their risk of having a baby with cystic fibrosis, Tay-Sachs and other genetic disorders. The test also promises users insights into how caffeine, cholesterol-lowering drugs and blood thinners might affect them.

Yeow, that’s going to set off a firestorm! A couple of years ago, when companies like 23andMe began to offer tests to consumers, the California and New York public health departments and the FDA tried to shut them down. They issued “cease and desist” orders and threatened to charge them with various violations of business-practice laws. In fact, the kerfuffle has already started.

The Food and Drug Administration questioned Monday whether the test will be sold legally because it does not have the agency’s approval. Critics have said that results will be too vague to provide much useful guidance because so little is known about how to interpret genetic markers.

The medical profession is conservative with good reason: lives are at stake. But in all this there is also, in my opinion, a component of protecting professional prerogatives. Professions in any field don’t give ground to the ordinary person easily.

I’ve had some experience with this. When I started in the cancer field 36 years ago we had two sets of printed literature: one set for the lay public and another for doctors and nurses. You were risking getting fired if you let a cancer patient get hold of the professional literature! The reasons then were the same ones physicians express now about internet information: “they (the public) won’t understand what it means; they will misinterpret it; they’ll suffer anxiety; they might make bad decisions about treatment.” But the internet irreversibly smashed the barrier to access to professional medical information. Doctors are still fighting a rear-guard action and complaining mightily about how it was better in the old days when they were the exclusive source of medical information. I’ve commented on that before.

I’m not dismissing the concerns. No doubt there will be unfortunate incidents around these new tests. But what gets me is how unwilling the medical profession is to see the revolution in information that is underway and to rethink the medical paradigm. My plea is for physicians to start — as a profession — to work on a more equitable and flexible basis with the citizens who want a greater and more equal role in their medical lives. We’ll always have a doctor/patient relationship, but I think it’s going to be much different in the not-too-distant future.

The internet isn’t going away; instead it’s going to go much, much deeper into our health lives. And genetic tests are not going away either. Like it or not, deep personal knowledge about what lurks in our genes is on the way. Why isn’t the medical profession working with entrepreneurs, patients, futurists, and internet gurus to anticipate what’s coming and do something positive that works for everyone? There’s much, much work to be done, and soon. Without a collaborative movement of innovation and adaptation we’re going to suffer through repeated, time-wasting bouts of friction.

17 Apr 10

the paradigm for the genetics of complex diseases is changing

[Image: the structure of part of a DNA double helix, via Wikipedia]

One of the themes of this blog is that living things are complex and making clinical gains from areas of research such as genetics is just plain hard. There’s been a lot of questioning of genetic research lately, but, as I’ve tried to point out, there are many factors other than plain ol’ DNA involved in the way genes manifest in disease. That basic situation got a much clearer articulation this past week when two highly respected genetics researchers at the University of Washington, Mary-Claire King and Jon McClellan, published an essay in Cell titled “Genetic Heterogeneity in Human Disease.”

For decades the basic genetics paradigm held that common diseases are caused by common variants (CDCV). That is, to look for genetic causes for cancers the reasonable thing would be to identify genetic variations (mutations) found most often in cancer cases. That makes sense, but it turns out that finding these common genetic variations is not enough to explain all the disease. King and McClellan say:

…from the perspective of genetics, we suggest that complex human disease is in fact a large collection of individually rare, even private, conditions…In molecular terms, we suggest that human disease is characterized by marked genetic heterogeneity, far greater than previously appreciated. Converging evidence for a wide range of common diseases indicates that heterogeneity is important at multiple levels of causation: (1) individually rare mutations collectively play a substantial role in causing complex illnesses; (2) the same gene may harbor many (hundreds or even thousands) different rare severe mutations in unrelated affected individuals; (3) the same mutation may lead to different clinical manifestations (phenotypes) in different individuals; and (4) mutations in different genes in the same or related pathways may lead to the same disorder.

There’s a huge idea here: Complex human diseases involve sets of complex genetic variations, so many, in fact, that each person’s case of a disease may have individual characteristics. We accept the idea that each individual is unique, but it’s perhaps surprising to think that your case of cancer, for instance, may bear individual characteristics.

The overall magnitude of human genetic variation, the high rate of de novo mutation, the range of mutational mechanisms that disrupt gene function, and the complexity of biological processes underlying pathophysiology all predict a substantial role for rare severe mutations in complex human disease. Furthermore, these factors explain why efforts to identify meaningful common risk variants are vexed by irreproducible and biologically ambiguous results.

Next-generation sequencing provides its own challenges. Whole-genome sequencing strategies detect hundreds of thousands of rare variants per individual (McKernan et al., 2009). Biological relevance must be established before a mutation can be causally linked to a disorder. The critical question is not whether cases as a group have more rare events than controls; but rather which mutation(s) disrupting a gene is responsible for the illness in the affected person harboring the variant. Variable penetrance, epistasis, epigenetic changes, and gene-environment interactions will complicate these efforts. It will be fun to sort out. [Emphasis mine.]

So, as I’ve remarked before, life is complicated. Living systems are the most complex things we know of in the universe, and we’re only now beginning to explore them in detail. We want results to save us now! But it’s going to be some time before we fully understand diseases like cancer, and a long time after that before effective therapies are widely available. Moreover, we have no idea what it’s all going to cost, and, as our recent rancorous debate on health care demonstrates, cost is no trivial matter.

11 Mar 10

The family genes

I’ve written several posts about the criticism this year of the meager results of gene sequencing in finding therapies for diseases. The genetic keys to diseases have proven elusive, to the point that there has been discouragement in the field. But there’s perhaps a more positive note in today’s NY Times about two studies being published in journals on Friday. For the last decade the operating assumption of genetics and disease has been that common diseases like cancer come from common mutations in genes. But the evidence connecting commonly seen genetic mutations to common diseases has not been strong. Instead, the conclusion has been emerging that diseases are really linked to rare mutations. So all those news headlines you’ve seen over the last 10 years or so declaring “gene for depression found” were wrong. It’s not that simple.

For three diseases — Charcot-Marie-Tooth disease, Miller syndrome and ciliary dyskinesia — it turns out that the disease arises from more obscure genetic changes passed along by ordinary Mendelian family inheritance. The studies sequenced the whole genomes not only of the children who express the disease but of the parents as well. So they got what you might call the whole-family genome. Identifying diseases that manifest differently depending on the mix of genes coming from mom and dad means that the genomes of the whole troop might be needed.

Fortunately the cost of doing a whole genome is dropping, fast. Complete Genomics of Mountain View, Calif., did the genomes in one of the studies at $25,000 each. That’s a whole lot better than the $3 billion for the first genome ten years ago. They’re promising the $10,000 genome to be followed by the $5,000 genome. Remember, the holy grail is $1,000.

I said in my previous post about the 21st century medical model that our personal health record will need to contain our whole genome. This suggests that linking the genomes of the rest of the family will make the assessments of lifetime disease risk a lot better.

07 Mar 10

Health paradigm for the 21st C, part 2

Okay, Part 1 of this post was precipitated by the Society for Participatory Medicine’s request for ideas about what members would like to see it do. I talked about my take on the whys and wherefores of participatory medicine. This post is a list of eight activities I’d like to see supported by the Society for Participatory Medicine:

1. Develop an actionable plan for the goal of enabling each individual to become his or her own primary care authority for 90%-95% of health incidents.

Primary care docs want to go specialist because it pays more, so why not elevate the individual to the role of primary care provider and shift the physician to the role of specialist, involved as needed? A few months ago during the health care debate on The Health Care Blog I saw a remark (by a physician, as I recall) that about 80% of health events are handled by the individual: cuts and minor trauma, headaches, colds and flu, aches and pains, nutrition, supplements, upset GI, menstrual issues, and on and on. The drugstore is often the supply center for this first line of treatment. What if, with the right tools and support, that percentage could be raised to 90% or 95%?

2. Develop a plan for building well-developed, well-funded information support systems specifically to support lifelong personalized health learning and decision making.

The internet is little more than a platform for information storage and cheap distribution, with content kluged together from unrelated sources. Even so, people have already adopted the internet as a primary source for health information (Pew Internet Surveys). But so far there is no well-funded health resource base specifically designed to achieve anything like the goal above. The internet is a hodgepodge of sites and information of variable quality. WebMD and other commercial sites provide general content as part of their marketing platform. Wikipedia is one crowd-sourced way to compile information, but its quality has been challenged and the whole enterprise criticized. Medpedia, with content from academics at reputable institutions, arose pretty much to be an authoritative alternative to the noise of internet health information, but it’s primarily a reference work and does not seem to have figured out the public involvement part. There are thousands of nonprofit and government sites with bits and pieces of information, but there is no sign of a national commitment to an architecture designed to empower the public with knowledge in a person-specific or engaging way. The only site I am aware of that seems within striking distance of the necessary comprehensiveness is the National Library of Medicine. Its Medline and PubMed resources might be a precursor to a more innovative way of supporting personal medicine.

The information from a well-designed and well-networked system should include a mechanism that helps everyone understand which medical information is “evidence-based” and what the certainty level of the current evidence is. This substantiation of information should happen on a dynamic, constantly updated basis. The system should also help people learn that the scientific process works toward greater certainty over time and that grey areas with less than 100% proof are a necessary part of understanding medicine.

3. The integrated health knowledge network suggested in item 2 should take a systems approach to human biology and medicine.

In the 20th century the human organism was disassembled for study by segmentation and reductionism. Specialized areas of medicine, nonprofit organizations, and governmental expert agencies took off in their own directions too. The result is a very fragmented picture of health that still dominates today. Knowledge supporting personal health engagement should put the puzzle of health together. The knowledge base of health and life education should follow guidelines that support clarification of how various sub-systems of the human organism play a part in the function or malfunction of the whole.

4. The approach to participatory medicine should be founded on the principle that learning about health is a lifelong matter.

Information should be communicated and made available on an as-needed or just-in-time basis throughout life but within a cohesive systems framework. As I pointed out in an earlier post, parents are beginning to accumulate and electronically record information about children at birth. With the cost of full genome sequencing plummeting it is likely that the process will eventually become routine at birth. It does not seem out of the question that health knowledge can start at birth with a full family genome and health history as a basis for baseline health assessment and risk estimation.

From the outset, children are curious about their bodies, and many teachable moments are possible if appropriate information is provided in a personalized, situation-specific way. A whole range of age-appropriate information should utilize current and future technology to find innovative ways of interfacing health information with the many learning opportunities throughout life. Games, avatars, social networks, and virtual environments could be employed to engage various groups. People cannot and need not become experts in all aspects of medicine, but over time they can become experts about themselves and the health matters that are issues for them as indicated by genomic data, family history, race, and cultural variables. Needless to say, a health support information system will need to have as its mission staying abreast of and innovating with emerging technology.

5. Facilitate the evolution of an open system of quantifying sensors and devices that measure many aspects of bodily function, health status, fitness, and consumption and that can be seamlessly integrated with the knowledge network and EHRs and informed by personalized health models.

The problem with life is that we are born without a “dashboard” for our bodies and with no operating manual. When health problems arise, symptoms such as pain, swelling, and other sensations often come too late to prevent acute illness. And our bodies provide few perceptible clues about the precursors of chronic conditions.

Health 2.0 activity has shown that there are many entrepreneurs eager to supply devices and services related to a personal approach to health. But technology standards committees need to be established or coordinated so that devices and data supporting participation can avoid what has happened in the electronic medical record industry. Interoperability and integration are essential, and the participatory movement will be inhibited if these characteristics are not incorporated from the outset. Open data standards, open applications, and open media standards are necessary to put together the systems of communication, data recording and transmission, security, and social networking that are sub-systems of the greater vision.

The price of admission for entrepreneurs in participatory medicine should be open standards all around. Consumers should be advised not to support products that cannot be integrated with other components of the greater system (motto: “Homie don’t play ‘dat”). An encouraging development in this regard is the Open Mobile Health Exchange. Nevertheless, ongoing advocacy is needed to keep standards open.

6. Drive a counter-culture movement that encourages the US population to reset its expectations of the market economy from tolerance of the current state of health irresponsibility to an expectation of health benefit.

The market system in the US is health-indifferent; it is not held accountable for consumer products designed to exploit basic cravings regardless of long-term personal or societal health burdens. In fact, health corruption and health correction are complementary streams of income. Billions of dollars are spent on the design and marketing of products that contribute to illness, only to be answered by products and services marketed to compensate and bring us back toward health. It’s an amazing wealth engine where the right and left hands wash each other.

The weird thing about health “responsibility” in US society is that, with regard to food and drink, only consumers, not producers of goods, are considered responsible. If we over-consume a product designed and marketed to maximize our consumption, the producer is not held accountable. That’s the way it used to be with tobacco, but we changed the perception of responsibility about tobacco between the 1970s and the end of the last century.

A similar cultural change is needed around food and drink. We have a start; producers of sugary cereals and high fructose corn syrup drinks have been criticized for marketing them to children. Similar accountability — or at least social scorn — is necessary for other consumables. Producers have gotten away with saying, “Hey, we don’t force you to drink all that corn syrup. It’s your fault, not ours.” Perhaps as the extreme cost in dollars to US society from obesity and its consequences generates even more pain, we’ll be less willing to swallow the denial of culpability that the marketplace hides behind.

7. Advocate for funding the development of human biological system models that can be personalized so that a constant stream of information may be analyzed and used as a source of near-real-time feedback about our health status and behavior.

We need sophisticated human systems biology and computer health models based on the best scientific information. They should be designed so that health data from our genomes, family history, lifetime health history, and daily activity can be combined to form a personalized profile or algorithm. Our own model — embodied perhaps as an avatar — could be constantly available to interpret data and give us feedback or status reports. Such personalized models could also set the appropriately personal context for health information and learning. A toy sketch of what that kind of feedback loop might look like follows.
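To make item 7 a little more concrete, here is a toy sketch in Python of the kind of feedback loop I have in mind. Everything in it is invented for illustration: the field names, the thresholds, and the messages are stand-ins for whatever a real, science-based personal model would eventually provide.

    # Toy illustration only: thresholds and field names are invented,
    # not derived from any real biological model.
    from dataclasses import dataclass

    @dataclass
    class PersonalProfile:
        """Targets tuned to one person's genome, history, and risk factors."""
        max_resting_heart_rate: int = 80   # beats per minute
        max_daily_sodium_mg: int = 1500
        min_daily_steps: int = 8000

    def daily_feedback(profile, readings):
        """Compare one day's sensor and diary readings against personal targets."""
        notes = []
        if readings.get("resting_heart_rate", 0) > profile.max_resting_heart_rate:
            notes.append("Resting heart rate is above your personal target.")
        if readings.get("sodium_mg", 0) > profile.max_daily_sodium_mg:
            notes.append("Sodium intake exceeded your daily limit.")
        if readings.get("steps", 0) < profile.min_daily_steps:
            notes.append("Step count fell short of your daily goal.")
        return notes or ["All tracked measures are within your personal ranges."]

    # One day's readings from hypothetical devices and a food diary.
    print(daily_feedback(PersonalProfile(),
                         {"resting_heart_rate": 85, "sodium_mg": 2300, "steps": 4200}))

The point is not the particular numbers but the architecture: the profile is personal, the data stream is continuous, and the interpretation happens automatically.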

8. Work to support augmented reality development for an environment that will enable us to get information on the fly about our options for the things we eat and drink.

Institutional support is needed to create an augmented-reality environment of information for restaurants and markets, via databases that support easy access to information about what we’re consuming. Bar codes, wi-fi, Bluetooth, RFID tags and future technologies should allow smartphones to immediately obtain information about the nutritional content of meals in restaurants and packaged products in markets. I already use an app called “FoodScanner” that uses the iPhone camera to scan package barcodes, look them up in a remote database, and provide me with the nutrition information food products are required to have on the package. The information can be saved for future use, but the whole process is pretty klutzy. A system that automatically grabs information and checks it against a personal profile of stuff to avoid is not hard to imagine; a minimal sketch of that kind of check follows below.
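Here, as a thought experiment, is what that automatic check might look like in a few lines of Python. The barcode, the lookup function, and the avoid-list are all hypothetical placeholders, not a real API; in practice the lookup would hit an open, shared nutrition database.

    # Hypothetical sketch: lookup_nutrition() stands in for a query against an
    # open nutrition database keyed by barcode; none of this is a real service.
    AVOID_INGREDIENTS = {"high fructose corn syrup", "aspartame"}  # personal avoid-list

    def lookup_nutrition(barcode):
        """Pretend remote lookup; returns label data for the scanned product."""
        return {"name": "Cola, 20 oz",
                "ingredients": {"carbonated water", "high fructose corn syrup"},
                "calories": 240}

    def check_product(barcode):
        """Flag anything in the scanned product that conflicts with my profile."""
        info = lookup_nutrition(barcode)
        hits = info["ingredients"] & AVOID_INGREDIENTS
        if hits:
            return ["{}: contains {}".format(info["name"], i) for i in sorted(hits)]
        return ["{}: no conflicts with your avoid-list.".format(info["name"])]

    print(check_product("012345678905"))  # barcode value is a placeholder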

When I was in school, at about 13 years old, we had “hygiene” class in which we had to learn the parts of the body (“pipes and plumbing,” as it was known) and their functions. Then in high school we boys got movies and slide shows with “the coach” to graphically show how disgusting VD and pregnancy are. That was supposed to deter us from sex until marriage. It was also all I got from public education about health. I suppose it was somehow supposed to enable me to maintain my health for life.

The steps I outlined above are, I hope, a more robust approach, and one consistent with the technology and lifestyles of the near future. The iGeneration evidently no longer sees a reason to fill their heads with generalized information of less than obvious personal applicability. They already know they have the option of getting appropriate information at the time it’s needed. Perhaps they’re already aware that the information they’ll be exposed to during their lives will be changing constantly. Making this situation lend itself to a healthier population is going to require many elements working together.

The things I’ve suggested also are simply ideas for a long-term process. If there’s one thing I’ve learned from a career in public health it is that change tends to be a lengthy, nonlinear process requiring tolerance for uncertainty and unexpected developments. Change is a career, not a project.
