Archive for the 'Computing & Society' Category


Denmark does telemedicine

A week ago I posted some rather snarky remarks about the resistance of physicians to full cooperation in a study that used telemedicine to augment treatment of people in ICUs.

In contrast, the NYTimes ran another article this week about a forthcoming report from the Commonwealth Fund on how telemedicine is being handled in Denmark. It tells of a 77-year-old man who has respiratory problems from smoking.

…he can go to the doctor without leaving home, using some simple medical devices and a notebook computer with a Web camera. He takes his own weekly medical readings, which are sent to his doctor via a Bluetooth connection and automatically logged into an electronic record.

“You see how easy it is for me?” Mr. Danstrup said, sitting at his desk while video chatting with his nurse at Frederiksberg University Hospital, a mile away. “Instead of wasting the day at the hospital?”

He clipped an electronic pulse reader to his finger. It logged his reading and sent it to his doctor. Mr. Danstrup can also look up his personal health record online. His prescriptions are paperless — his doctor enters them electronically, and any pharmacy in the country can pull them up. Any time he wants to get in touch with his primary care doctor, he sends an e-mail message. […]

Several studies, including one to be published later this month by the Commonwealth Fund, conclude that the Danish information system is the most efficient in the world, saving doctors an average of 50 minutes a day in administrative work. And a 2008 report from the Healthcare Information and Management Systems Society estimated that electronic record keeping saved Denmark’s health system as much as $120 million a year.

The most interesting thing about this, however, is that what makes this possible in Denmark and not in the US is a difference in attitude. It’s not about technology.

“It was a natural progression for us,” said Otto Larsen, director of the agency that regulates the system. “We believe in taking care of our people, and we had believed this was the right way to go.” […]

Kurt Nielsen, the [Thy-Mors] hospital’s director, says that while the doctors are not particularly adept at information technology, they have gradually embraced it. And it helps that the staff was involved in developing the innovations.

“My staff at the hospital is very, very satisfied,” he said. “We build these systems in an incremental way, and seek their input throughout.”

Everyone would acknowledge that implementing EHR, telemedicine and other technologies in a population of 6 million is a much smaller financial and procedural challenge. But when your society (i.e., the US and its health industry) seems to lack “we take care of our people” as a top value, the hurdles to putting such technological systems in place are nearly insurmountable.


Peering into the generation chasm

In the NYTimes Week in Review this past weekend Brad Stone wrote an article titled: “The Children of Cyberspace: Old Fogies by Their 20s.” He observed that his 2-year-old is already learning touch-screen technologies and the Kindle instead of books. His speculation is that “this generation” (i.e., his young daughter) is going to be quite different from kids now in their teens. Mini-gaps are rapidly developing in the way technology affects experience, equivalent to the way we used to think about 20-year generation gaps. He quotes Lee Rainie, director of the Pew Research Center’s Internet and American Life Project.

“People two, three or four years apart are having completely different experiences with technology…College students scratch their heads at what their high school siblings are doing, and they scratch their heads at their younger siblings. It has sped up generational differences.”

So that brings me to something that has been on my mind for some time: With the relentless effort to extend human life — and, concomitantly, work-life — how in the  world are current and future generations going to remain relevant enough to have economic value during the whole of their careers?

This is already a big problem. A few months ago I stumbled across a website (which unfortunately I didn’t bookmark) devoted to baby boomers railing about the age discrimination they face. Trailing-edge baby boomers are still in their late forties and need to work for perhaps another 20 to 25 years. But facing both layoffs and re-employment in 2009, they feel Gen-Xers and Gen-Yers are labeling them as technologically outdated and unfit for today’s jobs. Needless to say, the people participating in the site were very angry about what they perceive they’re facing. Some blame the mainstream media for creating false stereotypes of baby boomers as part of some left-wing conspiracy (I don’t know how they reached that conclusion).

I have to say that I think the younger folks have a point. I’m a recently retired baby boomer. I spent a great deal of my personal time and money during my career staying abreast of personal computing and the internet. I can’t say the same for many of my peers. These days just having “job experience” isn’t enough. To be a strategic leader you’ve not only got to stay abreast of the basic functionality of technology but also stay in touch with the cultural changes that accompany it. Managers with years of traditional experience under their belts need to understand how changing technology can be applied to innovate business operations and to position the organization for future opportunities. Frequently that’s not what “experienced” people do.

It seems to me this problem will only be exacerbated by the accelerating rate of technological change. People in their 20s should not be too smug; they’ll be looking at the issue from the other side soon enough. And even early-career workers need to consider how they’re going to stay mentally flexible enough to absorb new technology and new culture for the several remaining decades of their work-life.


Overcoming big hurdles in cancer research and treatment

I’ve posted a couple of times [here and here] about the complexity of living systems and the challenges that reality presents for medicine. That’s certainly true for cancer.

But the march toward greater information goes on. A Princeton U press release titled “Scientists find way to catalog all that goes wrong in a cancer cell” describes an algorithmic advance that makes it possible to define the pathways of complex genetic interactions. Researchers “were able to systematically categorize and pinpoint the alterations in cancer pathways and to reveal the underlying regulatory code in DNA.”

Researchers Saeed Tavazoie, Hani Goodarzi, and Olivier Elemento say:

“We are discovering that there are many components inside the cell that can get mutated and give rise to cancer…Future cancer therapies have to take into account these specific pathways that have been mutated in individual cancers and treat patients specifically for that.”

The researchers developed an algorithm, a problem-solving computer program that sorts through the behavior of each of 20,000 genes operating in a tumor cell. When genes are turned “on,” they activate or “express” proteins that serve as signals, creating different pathways of action. Cancer cells often act in aberrant ways, and the algorithm can detect these subtle changes and track all of them. […] The algorithm devised by the group scans the DNA sequence of a given cell — its genome — and deciphers which sequences are controlling what pathways and whether any are acting differently from the norm. By deciphering the patterns, the scientists can conjure up the genetic regulatory code that is underlying a particular cancer.

The goal: developing much more specific therapies targeted to these variations.
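To make the idea concrete, here is a minimal toy sketch (not the researchers’ actual algorithm) of one way such detection could work: compare each gene’s expression in a tumor sample against a normal-tissue baseline with z-scores, then map the aberrant genes back to the pathways they belong to. All gene names, pathway names, and numbers here are hypothetical.

```python
# Toy sketch of pathway-level aberration detection. Hypothetical data only.
from statistics import mean, stdev

# Baseline expression measurements for each gene across normal samples
normal_expression = {
    "GENE_A": [10.1, 9.8, 10.3, 10.0, 9.9],
    "GENE_B": [5.2, 5.0, 4.9, 5.1, 5.3],
    "GENE_C": [2.0, 2.2, 1.9, 2.1, 2.0],
}

# Hypothetical pathway membership
pathways = {
    "growth_signaling": ["GENE_A", "GENE_B"],
    "dna_repair": ["GENE_C"],
}

def aberrant_genes(tumor_expression, threshold=3.0):
    """Return genes whose tumor expression deviates more than
    `threshold` standard deviations from the normal baseline."""
    flagged = {}
    for gene, baseline in normal_expression.items():
        mu, sigma = mean(baseline), stdev(baseline)
        z = (tumor_expression[gene] - mu) / sigma
        if abs(z) > threshold:
            flagged[gene] = round(z, 1)
    return flagged

def altered_pathways(flagged):
    """Map flagged genes back to the pathways they participate in."""
    return {name: [g for g in genes if g in flagged]
            for name, genes in pathways.items()
            if any(g in flagged for g in genes)}

tumor = {"GENE_A": 14.0, "GENE_B": 5.1, "GENE_C": 2.1}
flagged = aberrant_genes(tumor)
print(flagged)                   # GENE_A stands out far from baseline
print(altered_pathways(flagged)) # growth_signaling is implicated
```

The real work, of course, operates on roughly 20,000 genes and also deciphers regulatory DNA sequence, but the basic logic — compare against the norm, then roll deviations up into pathways — is the same shape.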

That’s cool. It’s great to keep digging for the information to understand and treat cancers. But I have to add — having spent many years in the sometimes idealistic realm of cancer control — there’s another giant issue that has reared its ugly head in recent years: cost. When I started my career three decades ago nobody was talking about the economic barriers to developing and widely adopting cancer therapies being such a problem. But the ongoing farcical struggle over reasonably equitable health care access for all Americans (much less the rest of the world) demonstrates that how much cancer treatment costs and who pays for it is a formidable problem in “curing cancer” in its own right. Teasing out the information this research suggests is needed, and then delivering treatments from it in practical terms to a big population, is going to be a long process.


Single atom transistor. Whee!!

Researchers in Finland and Australia have submitted a research paper about their demonstration of a single-atom transistor. The report says: “Researchers…have succeeded in building a working transistor, whose active region composes only of a single phosphorus atom in silicon.”

Why get excited about a geeky development like this?

I’ve made a big deal about how complex real-world systems such as a bacterium, or the genetics of cancer and other bad diseases, are. The only way we’ll eventually master them is by having more and more computing power for elaborate computer models and sensitive devices for looking deep within living things. To my mind these research reports are the early signals of forthcoming progress in electronic components, sensors, information storage and communication. These things are going to emerge in the not-too-distant future.

The rapid development of computers, which created the present information society, has been mainly based on the reduction of the size of transistors. We have known for a long time that this development has to slow down critically during the future decades when the even tighter inexpensive packing of transistors would require them to shrink down to the atomic length scales.

These are not going to be transistors like we have in our laptops and mobiles; they’re going to be transistors for the next generation of computing: quantum computers. To be sure, there are enormous engineering challenges to getting to the next level, but there are frequent reports of developments like this one. It’ll happen and the impact will be great.

Unraveling the complexities of life processes and applying them to our day-to-day concerns is daunting. But developments like the single-atom transistor say to me that the forces of technology are converging to keep us moving ahead. We just need to persist.


Google calling

Evidently Google is maneuvering to take on telephony head-to-head. This Wired Epicenter blog post reports on Google’s acquisition of a company called Gizmo5 that does VOIP using open standards. So Google continues to toss bombs into the traditional business space of phone companies.

I marvel at the audacity of Google to challenge the established boundaries of the digital world. It doesn’t seem to “know its place” in the order of things. Google knows digits, and beyond that it’s willing to re-imagine any niche previously dominated by older technologies and institutions. Those managing the establishment are on notice that nothing in the digital realm is sacred.

To my mind Google and Apple are a couple of the best engines of change operating worldwide today. Technology inserts itself into the social order in innocuous ways like communication and entertainment. But in the long run it becomes the platform for a new social order.


Think really BIG…forever

IBM is famous for having had signs around their offices in years past that said “THINK.” Evidently they need to modify the sign to say: “THINK BIG!” According to a NY Times article the other day, Big Blue and other computing leaders like Google are concerned that many students currently being trained in college are not being exposed to the tools that will enable them to imagine and execute projects that could be accomplished with the huge data sets and super-computing systems that will be available to them in their professional futures.

These days you can buy a terabyte drive for your PC for a couple hundred bucks. But fields like genomics, astronomy, geology, and medical imaging and modeling process petabytes of data already. And data is scaling exponentially.

But students typically deal with PCs or perhaps clusters of moderate performance. The concern is that those experiences will trap their thinking on that scale.

“If they imprint on these small systems, that becomes their frame of reference and what they’re always thinking about,” said Jim Spohrer, a director at I.B.M.’s Almaden Research Center.

The word “imprint” is what caught my eye. Since my last post about today’s tots living to be 100 I’ve been thinking: “How will this next generation remain flexible enough to be of value over a career that spans perhaps 50 or more years?” In my experience people tend to “imprint” on what they experience in their 20s and 30s. After that, perceiving and being open to change around them can become problematic. In a world that is likely to change at an exponential rate, how do people in their mid- to late-career years continue to be valuable? This has been a problem in highly technical fields like computing for quite a while already.

Living long is going to require more than being physically and mentally healthy; it’s going to call for qualities like creativity, perceptiveness, imagination, flexibility, and resilience over a longer span of time than for previous generations.


Full metabolism model. How exciting!

Science published a study today by the Burnham Institute at UC San Diego, The Scripps Institute, and the Novartis Genomics Institute reporting that they have for the first time modeled the central metabolic pathway system of a bacterium, complete with 3-D, atomic resolution overlays of the involved proteins. Exciting, no?

On the Burnham website they say:

Combining biochemical studies, structural genomics and computer modeling, the researchers deciphered the shapes, functions and interactions of 478 proteins that make up T. maritima’s central metabolism. The team also found connections between these proteins and 503 unique metabolites in 562 intracellular and 83 extracellular metabolic reactions.

“We have built an actual three dimensional model of every protein in the central metabolic system,” said Adam Godzik, Ph.D., director of Burnham’s Bioinformatics and Systems Biology program. “We got the whole thing. This is analogous to sequencing an entire genome.”

Here’s a link to a little video on Vimeo about the project.

Developing a solid computer model of any living thing — even a bacterium — is an important step. Computer modeling of airplanes, electronic components, architectural projects and many other things has enabled huge strides in understanding and efficiently designing much of what we use today. But the innards of any living thing have been so complex that full modeling has been pretty much beyond reach. What’s exciting about this accomplishment is the prospect of scaling up to model cells, organisms, and even ecosystems. If computer models of cells and organisms prove to be as useful as modeling in other areas has, then this is a bit like putting a rocket engine behind bio-medical research.
