
Friday Thinking 14 April 2017

Hello all – Friday Thinking is a humble curation of my foraging in the digital environment. My purpose is to pick interesting pieces, based on my own curiosity (and the curiosity of the many interesting people I follow), about developments in some key domains (work, organization, social-economy, intelligence, domestication of DNA, energy, etc.)  that suggest we are in the midst of a change in the conditions of change - a phase-transition. That tomorrow will be radically unlike yesterday.

Many thanks to those who enjoy this.
In the 21st Century curiosity will SKILL the cat.
Jobs are dying - work is just beginning.

“Be careful what you ‘insta-google-tweet-face’”
Woody Harrelson - Triple 9

Content
Quotes:

Articles:
Japanese man is first to receive 'reprogrammed' stem cells from another person



...legal reasoning is a much more supple exercise. Individual judges must resolve knotty questions under conditions of uncertainty, and in a context in which there’s usually profound disagreement about both what has happened and what ought to be done about it.

In these circumstances, imagination performs many salutary functions. Indeed, legal reasoning would be impossible without it. Imagination allows judges to explore what might be at stake in any particular dispute, and to provide a set of resources for future decision-makers. It lets them communicate doubt and express hesitation. And it brings the language of law alive, moving us and inviting us to imagine further – and so enables a thriving, interactive community of enquiry.

Of course, imagination also carries certain dangers. It might encourage bias, or signal a departure from common sense. But overall it should be celebrated – in law and, perhaps, in other domains where people must engage in the messy business of public reasoning.

The legal imagination



“Each generation imagines itself to be more intelligent than the one that went before it, and wiser than the one that comes after it.” – George Orwell

What will happen as we enter the era of human augmentation, artificial intelligence and government-by-algorithm? James Barrat, author of Our Final Invention, said: “Coexisting safely and ethically with intelligent machines is the central challenge of the twenty-first century.”

Shall we then demand that our children and grandchildren — perhaps a bit augmented and smarter than us, but certainly vastly more knowledgeable — ought to follow blueprints that we lay down? Like Cro-Magnon hunters telling us never to forget rituals for propitiating the mammoth spirits? Or bronze age herdsmen telling us how to make love?

Ben Franklin and his apprentices led a conspiracy against kings and priests, crafting systems of accountability not in order to tell their descendants how to live, but in order to leave those later citizens the widest range of options. It is that flexibility — wrought by free speech, open inquiry, due process and above all reciprocal accountability — that lent us our most precious sovereign power. To learn from mistakes and try new things, innovating along a positive-sum flow called progress.

David Brin - Preparing for our posthuman future of artificial intelligence




The working schedules of the greatest artists, philosophers, and thinkers can seem highly specific, indulgent, and even partly ludicrous when set against the prevailing 40-plus-hour workweek that has become the routine schedule for our species in the past century. Even more, the idea that many of humanity’s greatest achievements were produced by people who worked no more than four hours a day seems not only counterintuitive, but impossible.

Not so, according to a towering body of evidence in support of workdays that are shorter and more thoughtfully designed in accordance with what we know about the science of productivity.

The Ideal Workweek, According to Science



The “black box” problem is endemic in deep learning. The system isn’t guided by an explicit store of medical knowledge and a list of diagnostic rules; it has effectively taught itself to differentiate moles from melanomas by making vast numbers of internal adjustments—something analogous to strengthening and weakening synaptic connections in the brain. Exactly how did it determine that a lesion was a melanoma? We can’t know, and it can’t tell us. All the internal adjustments and processing that allow the network to learn happen away from our scrutiny.

As is true of our own brains. When you make a slow turn on a bicycle, you lean in the opposite direction. My daughter knows to do this, but she doesn’t know that she does it. The melanoma machine must be extracting certain features from the images; does it matter that it can’t tell us which? It’s like the smiling god of knowledge. Encountering such a machine, one gets a glimpse of how an animal might perceive a human mind: all-knowing but perfectly impenetrable.

“Imagine pitting a baseball player against a physicist in a contest to determine where a ball might land,” he said. “The baseball player, who’s thrown a ball over and over again a million times, might not know any equations but knows exactly how high the ball will rise, the velocity it will reach, and where it will come down to the ground. The physicist can write equations to determine the same thing. But, ultimately, both come to the identical point.”

A.I. VERSUS M.D. - What happens when diagnosis is automated?
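
To make the "black box" point above concrete, here is a minimal sketch in Python (using made-up data and a small off-the-shelf scikit-learn network, not the melanoma system the article describes): the model learns to classify well, yet all we can inspect afterwards are matrices of numeric weights with no human-readable rules attached.

```python
# A minimal sketch of the "black box" point, not the melanoma system itself:
# train a small neural network on synthetic data, then inspect what it "knows".
# The data, labels, and network size are made up purely for illustration.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))           # 500 fake cases, 10 features each
y = (X[:, 0] * X[:, 3] > 0).astype(int)  # a hidden rule the network must discover

net = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
net.fit(X, y)
print("training accuracy:", round(net.score(X, y), 3))

# The learned "knowledge" is nothing but arrays of weights - thousands of
# numbers with no human-readable rule attached to any of them.
for i, w in enumerate(net.coefs_):
    print(f"layer {i} weight matrix shape: {w.shape}")
```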



Get out of your silo. If you can’t figure out the societal and cultural implications of what you’re doing, start seeking out people who might, and start systematically having lunch with them. And then invite the most interesting ones into your lab with the goal of them becoming partners.

50 Grand Challenges for the 21st Century




This is a MUST VIEW - a new online journal signalling a paradigm shift in the future of science publication, one suited to the digital environment - at last transcending the printing press: interactive, visual, with embedded data and more.

Machine Learning Research Should Be Clear, Dynamic and Vivid. Distill Is Here to Help.

A modern medium for presenting research
The web is a powerful medium to share new ways of thinking. Over the last few years we’ve seen many imaginative examples of such work. But traditional academic publishing remains focused on the PDF, which prevents this sort of communication.

New ways of thinking enable new discoveries
New notations, visualizations, and mental models can deepen our understanding. By nurturing the development of such new ways of thinking, Distill will enable new discoveries.

Machine learning needs more transparency
Machine learning will fundamentally change how humans and computers interact. It’s important to make those techniques transparent, so we can understand and safely control how they work. Distill will provide a platform for vividly illustrating these ideas.

Legitimacy for non-traditional research artifacts
Many researchers want to do work that is not easily contained within a PDF, but can’t get the support they need to pursue it. We, as the research community, are failing them by not treating these wonderful, non-traditional research artifacts as “real” academic contributions.


This is a new Canada 2020 paper - well worth the read.
“In the post-fact era of mistrustful populism here is a compelling case for governments to double down on evidence, Big Data and predictive analytics. Lenihan and Pitfield argue that “civil analytics” holds the alluring promise of delivering powerful cures for today’s most wicked policy problems. It is a promise of effective, value-for-money government that everyone should welcome.”
Giles Gherson
DM, Ontario Ministry of Economic Development and Growth

The Rise of Civil Analytics: How Big Data is About to Explode Policymaking As We Know It

This is the first of a series of papers Canada 2020 will release on data and policymaking. In it, Tom Pitfield and Don Lenihan explain the shifts that will occur thanks to massive amounts of high-quality data and a new capacity for data analytics. Using the right analytics tools, and involving the right leaders, could be considered an answer to the post-fact politics that seem to be rising up all around us.

Civil Analytics, as Pitfield and Lenihan have defined the term, is a holistic approach to data, the tools that can be used to analyze it, and the various people who should be engaged to examine it. As agencies and individuals with various interests are included in the process of understanding the data and creating policies, they will feel a greater ownership over them, which results in easier adoption.

Pitfield and Lenihan have big things to say about history, technology and politics, and this essay should be of interest to anyone watching where policy is going and what big trends are on the horizon.
The paper can be downloaded.


While the hype around Big Data is very large - that doesn’t mean there isn’t a deep reality that is nearing with accelerating speed. This is an interesting article from Nature - talking about the inevitable way it will transform and disrupt manufacturing. What the Internet as platform and Big Data enable is a shift to the Coordination Economy and the collapse of many traditional transaction costs. This is also a signal warning of disruptions to many traditional research models and organizational architectures.
Accomplishing all this requires intellectual effort, money and better collaboration between industry, academia and government. In the past five years, several regional initiatives for smart manufacturing have been set up. But their goals are mainly directed by industry, and the more general aspects of data and modelling receive little attention.

Smart manufacturing must embrace big data

Study and model industrial processes to save money, energy and materials, urges Andrew Kusiak.
Manufacturing is getting smart. Companies are increasingly using sensors and wireless technologies to capture data at all stages of a product's life. These range from material properties and the temperatures and vibrations of equipment to the logistics of supply chains and customer details. Truck engines beam back data on speed, fuel consumption and oil temperature to manufacturers and fleet operators. Optical scanners are used to spot defects in printed electronics circuits.

But big data is a long way from transforming manufacturing. Leading industries — computing, energy and aircraft and semiconductor manufacturing — face data gaps. Most companies do not know what to do with the data they have, let alone how to interpret them to improve their processes and products. Businesses compete and usually operate in isolation. They lack software and modelling systems to analyse data.

Yet smart manufacturing can make industry more efficient, profitable and sustainable. Minimizing the distances over which products and components are transported reduces costs — financial and environmental. Computer modelling can identify risks and pinch points. It can, for example, anticipate impacts from deliveries delayed by extreme weather, a situation that affected the electronics industry after Thailand's major floods in 2011. Similarly, predicting when a car component is likely to fail and fixing it swiftly avoids expensive recalls and litigation. In industries with low yields, such as semiconductor manufacturing, halving production errors substantially increases profits.


Critiques of Big Data sometimes invoke the death of Moore’s Law - this is another signal to file under “Moore’s Law is Dead - Long Live Moore’s Law”.
...because Google designed the chip specifically for neural nets, it can run them 15 to 30 times faster than general-purpose chips built with similar manufacturing techniques.

Building an AI Chip Saved Google From Building a Dozen New Data Centers

Google operates what is surely the largest computer network on Earth, a system that comprises custom-built, warehouse-sized data centers spanning 15 locations on four continents. But about six years ago, as the company embraced a new form of voice recognition on Android phones, its engineers worried that this network wasn’t nearly big enough. If each of the world’s Android phones used the new Google voice search for just three minutes a day, these engineers realized, the company would need twice as many data centers.

At that time, Google was just beginning to drive its voice recognition services with deep neural networks, complex mathematical systems that can learn particular tasks by analyzing vast amounts of data. In recent years, this form of machine learning has rapidly reinvented not just voice recognition, but image recognition, machine translation, internet search, and more. In moving to this method, Google saw error rates drop a good 25 percent. But the shift required a lot of extra horsepower.

Rather than double its data center footprint, Google instead built its own computer chip specifically for running deep neural networks, called the Tensor Processing Unit, or TPU. “It makes sense to have a solution there that is much more energy efficient,” says Norm Jouppi, one of the more than 70 engineers who worked on the chip. In fact, the TPU outperforms standard processors by 30 to 80 times in the TOPS/Watt measure, a metric of efficiency.

Google has used the TPU for a good two years, applying it to everything from image recognition to machine translation to AlphaGo, the machine that cracked the ancient game of Go last spring. Not bad—especially considering all the data center construction it helped avoid in the process.
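
TOPS/Watt is simply throughput divided by power draw. The numbers in the sketch below are assumed for illustration (they are not Google's published figures), but they show how a gap in work-per-watt translates into needing far fewer machines for the same workload.

```python
# Illustration of the TOPS/Watt efficiency metric. The throughput and power
# figures here are assumed for the example - they are not Google's published
# numbers for the TPU or any specific CPU.
def tops_per_watt(tera_ops_per_second: float, watts: float) -> float:
    """Work delivered per unit of power: tera-operations per second per watt."""
    return tera_ops_per_second / watts

cpu_eff = tops_per_watt(tera_ops_per_second=1.0, watts=100.0)   # assumed general-purpose chip
tpu_eff = tops_per_watt(tera_ops_per_second=45.0, watts=75.0)   # assumed neural-net accelerator

print(f"CPU: {cpu_eff:.2f} TOPS/W")
print(f"TPU: {tpu_eff:.2f} TOPS/W")
print(f"ratio: {tpu_eff / cpu_eff:.0f}x")   # ~60x with these assumed numbers
# At data-center scale, a 30-80x gap in work-per-watt means the same workload
# can run on a small fraction of the machines - hence the avoided data centers.
```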


This is a very interesting article - worth looking at. The answers are brief.

50 Grand Challenges for the 21st Century

We asked experts from the world of science and technology to describe the societal challenges that they think matter in 2017 and beyond. Read the full list of responses below.
Over the past month, Future Now has been covering the “grand challenges” we face as a society in a series of articles, videos and graphics. We polled a panel of people from various fields about the vital issues they believe deserve more attention – you can browse 50 of those responses below, which we’ll continue to draw on throughout this year. There’s a lot to digest in one sitting – so dip in, reflect, come back...


This is a longish article - but worth the read for anyone interested in the future of humanity - beyond what we currently think of as humanity.

The Post-Human World

A conversation about the end of work, individualism, and the human species with the historian Yuval Harari
Harari’s previous work, Sapiens, was a swashbuckling history of the human species. His new book is another mind-altering adventure, blending philosophy, history, psychology, and futurism. We spoke recently about its most audacious predictions. This conversation has been edited for concision and clarity.


This is a Must read article for anyone wanting a clear explanation of the origins of CRISPR and how it works in the wild - all part of the ongoing efforts to domesticate DNA.
Researchers now know there is a confetti-storm of different CRISPR systems, and the list continues to grow. Some are simple — such as the CRISPR/Cas9 system that’s been adapted for gene editing in more complex creatures — and some are elaborate, with many protein workhorses deployed to get the job done.

CRISPR had a life before it became a gene-editing tool

Natural CRISPR systems immunize bacteria from invading viruses, and more
It is the dazzling star of the biotech world: a powerful new tool that can deftly and precisely alter the structure of DNA. It promises cures for diseases, sturdier crops, malaria-resistant mosquitoes and more. Frenzy over the technique — known as CRISPR/Cas9 — is in full swing. Every week, new CRISPR findings are unfurled in scientific journals. In the courts, universities fight over patents. The media report on the breakthroughs as well as the ethics of this game changer almost daily.

But there is a less sequins-and-glitter side to CRISPR that’s just as alluring to anyone thirsty to understand the natural world. The biology behind CRISPR technology comes from a battle that has been raging for eons, out of sight and yet all around us (and on us, and in us).

The CRISPR editing tool has its origins in microbes — bacteria and archaea that live in obscene numbers everywhere from undersea vents to the snot in the human nose. For billions of years, these single-celled organisms have been at odds with the viruses — known as phages — that attack them, invaders so plentiful that a single drop of seawater can hold 10 million. And natural CRISPR systems (there are many) play a big part in this tussle. They act as gatekeepers, essentially cataloging viruses that get into cells. If a virus shows up again, the cell — and its offspring — can recognize and destroy it. Studying this system will teach biologists much about ecology, disease and the overall workings of life on Earth.

But moving from the simple, textbook story into real life is messy. In the few years since the defensive function of CRISPR systems was first appreciated, microbiologists have busied themselves collecting samples, conducting experiments and crunching reams of DNA data to try to understand what the systems do. From that has come much elegant physiology, a mass of complexity, surprises aplenty — and more than a little mystery.


This is an interesting article signalling the emerging applicability of genetic sequencing to make precision medicine that works more effectively with population diversity.

How the genomics revolution could finally help Africa

New investments promise to get precision medicine and precision public health off the ground. But experts debate how much work needs to be done first.
It took a public-health disaster for the Zimbabwean government to recognize the power of precision medicine. In 2015, the country switched from a standard three-drug cocktail for HIV to a single-pill combination therapy that was cheaper and easier for people to take every day. The new drug followed a World Health Organization recommendation to incorporate the antiretroviral drug efavirenz as a first-line therapy for public-health programmes. But as tens of thousands of Zimbabweans were put onto the drug, reports soon followed about people quitting it in droves.

Collen Masimirembwa, a geneticist and founding director of the African Institute of Biomedical Science and Technology in Harare, was not surprised. In 2007, he had shown that a gene variant carried by many Zimbabweans slows their ability to break down efavirenz. For those with two copies of the variant — about 20% of the population — the drug accumulates in the bloodstream, leading to hallucinations, depression and suicidal tendencies. He had tried to communicate this to his government, but at the time efavirenz was not a staple of the country's HIV programme, and so the health ministry ignored his warnings.

Enter 'precision public health' — a new approach to precision medicine that bases health decisions on populations and communities rather than on individuals. It would use genomic insights into a country's population to inform general treatment programmes. For instance, a country might tweak its essential medicines list that specifies the drugs it buys in bulk at reduced rates from pharmaceutical companies, to avoid medicines that are known to cause problems in its population.

This is already happening in some places. Botswana — a middle-income country — stopped using the three-in-one drug containing efavirenz in 2016, opting instead for a newer and better-performing, but more expensive, drug called dolutegravir. The gene variant that causes problems with efavirenz is common in Botswana — around 13.5% of the population has two copies of it. And in 2015, Ethiopia banned the use of the painkiller codeine, because a high proportion of people in the country carry a gene variant that causes them to rapidly convert the drug into morphine, which can cause breathing problems or even death.
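
A back-of-envelope sketch of what those homozygote figures imply, assuming Hardy-Weinberg equilibrium (an idealization assumed here, not a claim from the article): if roughly 20% of Zimbabweans carry two copies of the variant, the allele frequency is about 0.45 and close to half the population carries at least one copy.

```python
# Back-of-envelope: what do the homozygote rates above imply about allele and
# carrier frequencies? Assumes Hardy-Weinberg equilibrium - an idealization,
# not something the article itself invokes.
import math

def allele_and_carrier_freq(homozygote_freq: float):
    q = math.sqrt(homozygote_freq)   # frequency of the slow-metabolizer allele (q^2 = homozygotes)
    p = 1.0 - q
    return q, 2 * p * q              # allele frequency, heterozygous carriers (2pq)

for country, hom in [("Zimbabwe", 0.20), ("Botswana", 0.135)]:
    q, carriers = allele_and_carrier_freq(hom)
    print(f"{country}: allele frequency ≈ {q:.2f}, carriers with one copy ≈ {carriers:.2f}")
# Zimbabwe: q ≈ 0.45, carriers ≈ 0.49 - roughly half the population has at
# least one copy, which is why a population-level drug policy makes sense.
```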


Here’s another great example of citizen science and social computing from Zooniverse.
... this project is an amazing demonstration of the power of crowdsourced science. Without the general public, this project would have taken nearly four years.

Crowdsourced Science Finds Four Candidates For Planet Nine

...recently there has been indirect evidence of a Uranus-sized planet lurking on the outer edge of our solar system, one that just might be lurking in the astronomical data we've gathered.

The evidence for a ninth planet comes from the orbits of the six most distant known objects in the solar system. These are icy, Pluto-like objects well beyond the orbit of Neptune. If they were truly at the edge of our solar system, one would expect their orbits to be rather randomly distributed. But instead their orbits are bunched together a bit, which implies they interact with another large body. Based on orbital statistics, the estimated mass of this hypothetical body is about 10 times that of Earth, which would make it a bit smaller than Uranus.

The problem is that identifying an object like planet nine is hard. It would be a faint object, and its motion across the sky would be rather small. So Zooniverse started a project where the general public could help categorize millions of objects in the Sky Survey data. Over just 3 days, more than 60,000 people identified about 5 million objects. Out of the data they found four objects that weren't previously known and could satisfy the conditions for planet nine. The next step is to make new observations of these objects to see if they are asteroids, dwarf planets, or the hypothetical planet nine.


The examples of crowdsourcing are numerous - but not all of them are positive - this is a scary signal of new boundaries of influence. This is worth the read.

Sockpuppets, Secessionists, and Breitbart

How Russia May Have Orchestrated a Massive Social Media Influence Campaign
Two types of accounts attempted to manipulate the conversation in conservative online spaces throughout the 2016 election: automated accounts, also known as “bots,” and human-operated fake accounts, also known as “personas” or “sockpuppets.”

While the FBI looks into the Trump campaign’s ties to Russia and the role of far-right outlets in manipulating the media ecosystem, the Senate Intelligence Committee investigates Russia’s use of paid trolls and bots, and former high-ranking Trump administration officials offer testimony in exchange for immunity, new evidence points to a highly orchestrated, large-scale influence campaign that infiltrated Twitter, Facebook, and the comments section of Breitbart during the run-up to the 2016 election. Tens of thousands of bots and hundreds of human-operated, fake accounts acted in concert to push a pro-Trump, nativist agenda across all three platforms in the spring of 2016. Many of these accounts have since been refocused to support US secessionist movements and far-right candidates in upcoming European elections, all of which have strong ties to Moscow and suggest a coordinated Russian campaign.

The difference between how a word is used in a given online community, compared with how it’s used in mainstream language, is that word’s novelty. Novel words in any community are usually distinct, but in the spring of 2016, the most novel words in four major online communities started to overlap. Instead of many thousands of unique, individual voices, it was as if one voice became dominant.

...tactics and technologies make it incredibly difficult for social media platforms, internet service providers, and law enforcement to connect sockpuppet accounts to their human operators. So difficult, in fact, that reportedly no agency within the US government has the technical capability and accompanying authority to detect or defend against these types of influence operations. According to analysts close to the problem inside government, “Nobody knows, and help is not on the way.”
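
The excerpt above describes word "novelty" only conceptually. One simple way to operationalize it is to compare a word's relative frequency in a community's text against a mainstream reference corpus, for example as a smoothed log-ratio; the toy corpora and the scoring function below are illustrative assumptions, not the report's actual method.

```python
# One illustrative way to score word "novelty": how over-represented a term is
# in a community's text relative to mainstream language, as a smoothed
# log-ratio of relative frequencies. The toy corpora and the exact scoring
# function are assumptions for this sketch, not the report's actual method.
import math
from collections import Counter

def novelty_scores(community_text: str, mainstream_text: str, smoothing: float = 1.0) -> dict:
    comm = Counter(community_text.lower().split())
    main = Counter(mainstream_text.lower().split())
    comm_total = sum(comm.values())
    main_total = sum(main.values())
    scores = {}
    for word, count in comm.items():
        p_comm = (count + smoothing) / (comm_total + smoothing)
        p_main = (main.get(word, 0) + smoothing) / (main_total + smoothing)
        scores[word] = math.log(p_comm / p_main)  # > 0: over-represented in the community
    return scores

community = "rigged rigged globalists borders borders election media"
mainstream = "election media policy debate economy election media budget"
top = sorted(novelty_scores(community, mainstream).items(), key=lambda kv: -kv[1])[:3]
print(top)  # the community's most "novel" (over-represented) words
```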


This is a longish article discussing the emerging changes in the energy paradigm.

Nanogrids, Microgrids, and Big Data: The Future of the Power Grid

Distributed generation and automated transactions will change how we produce and consume electricity
...a lot of progress is about to happen in an unexpected spot: the electricity sector. The power grid’s interlocking technological, economic, and regulatory underpinnings were established about a century ago and have undergone only minimal disruption in the decades since. But now the industry is facing massive change.

For about a century, affordable electrification has been based on economies of scale, with large generating plants producing hundreds or thousands of megawatts of power, which is sent to distant users through a transmission and distribution grid. Today, many developments are complicating that simple model.

At the top of the list is the availability of low-cost natural gas and solar power. Generators based on these resources can be built much closer to customers. So we are now in the early stages of an expansion of distributed generation, which is already lessening the need for costly long-distance transmission. That, in turn, is making those new sources cost competitive with giant legacy power plants.

Distributed generation has long been technically possible. What’s new now is that we are nearing a tipping point, beyond which, for many applications, distributed generation will be the least costly way to provide electricity.

Although my main goal is to describe a hopeful vision that many of us in the utility business have for the electric grid, I would be remiss if I did not point out some of the challenges. These include financial ones, regulatory ones, and technical ones. And they come in all shapes and sizes.


This is cool - although it would be better if we didn’t need it.

Oleo sponge invented at Argonne National Laboratory can sop up oil in a spill

Oil can even be wrung out of the sponge and reclaimed.
A group of researchers at the Argonne National Laboratory have developed a sponge that will collect oil from bodies of water, which could improve how harbors and ports are cleaned, as well as how oil spills are managed.

The Oleo Sponge is made of a polyurethane foam whose interior surfaces are covered with oleophilic molecules that draw oil out of water. The challenge, according to Argonne, was finding a way to "glue" those oil-loving molecules to the sponge’s interior. That issue was tackled with the help of 2011 research from Argonne scientists, who were able to infuse metal oxide with nanostructures. The Oleo creators used that technique to develop a primer for the interior of the sponge that the oleophilic molecules stick to. The result is a sponge that can adsorb up to 90 times its weight in oil.

After use, the sponge can be wrung out and the oil can even be reclaimed in some cases. Argonne says it’s actively looking to commercialize the material through licensing or collaboration agreements, and the sponge could be ready for real-world use in less than five years, according to the Wall Street Journal.


An interesting development in Archaeology-Anthropology - about early collective versus autocratic societies - with some interesting questions about hierarchy.
"I think it's a breakthrough," agrees Michael E. Smith, an archaeologist at Arizona State University (ASU) in Tempe. "I've called it the most important work in the archaeology of political organization in the last 20 years." He and others are working to extend Blanton's ideas into a testable method, hoping to identify collective states solely through the objects they left behind.

It wasn't just Greece: Archaeologists find early democratic societies in the Americas

Fargher has led surveys and excavations here since 2007, studying the urban plan and material culture of a type of society many archaeologists once believed they'd never find in Mesoamerica: a republic. "Twenty or 25 years ago, no one would have accepted it was organized this way," says Fargher, who works at the research institute Cinvestav in Mérida, Mexico.

Now, thanks in part to work led by Fargher's mentor Richard Blanton, an anthropologist at Purdue University in West Lafayette, Indiana, Tlaxcallan is one of several premodern societies around the world that archaeologists believe were organized collectively, where rulers shared power and commoners had a say in the government that presided over their lives.

These societies were not necessarily full democracies in which citizens cast votes, but they were radically different from the autocratic, inherited rule found—or assumed—in most early societies. Building on Blanton's originally theoretical ideas, archaeologists now say these "collective societies" left telltale traces in their material culture, such as repetitive architecture, an emphasis on public space over palaces, reliance on local production over exotic trade goods, and a narrowing of wealth gaps between elites and commoners.


This is an ‘inevitable’ development - in conflicts and perhaps new forms of Improvised Explosive Devices - if not explosive, then sabotage and surveillance.

Could robotics enable a complete or near-complete takeover with minimal response?

Most militaries continue to look at warbots as support weapons that can conduct reconnaissance, selective strike, and logistical or other supporting tasks.

However, robots can be faster and cheaper than humans. Robots could enable entirely new attack and defense capabilities by fully utilizing robotics, sensors, AI and other capabilities.
* use massively superior mobility and heavily concentrated forces to break through the line of battle, collapsing the enemy front into several pockets of resistance
* envelop larger, less mobile infantry forces and isolate them from each other
* The concept of bypassing enemy strong points to attack weaker spots, leaving those strong points enveloped, was used perhaps as early as the Mongol invasion of Europe

Currently the concepts are to use robots for surveillance or as extra weapons for planes and soldiers.

A revolution would come where robots were able to perform superior infiltration, sabotage or misdirection. This would not just be at a small scale, but could enable a complete takeover without a response from the opponent.


Here’s another signal about the emergence of drones in everyday life. There’s a 1 min video.

The Swiss postal service is using autonomous drones to fly lab samples between two hospitals

An early example of drone delivery in a populated urban area.
The Swiss Post has started to fly drones to deliver laboratory samples between two hospitals in the city of Lugano — area population 150,000 — near the Italian border.

The national mail service partnered with California-based drone company Matternet to supply the drones and flight system, and has been flying them since the middle of March.

In the past two weeks, the drones have completed around 70 successful autonomous flights. The drones flew on their own without anyone directly piloting or watching the drones in person. Matternet says it has personnel monitoring all the flights from a remote location in case something goes awry.

Matternet’s quadcopters are small, measuring only 31 inches in diameter. They can carry a payload weighing up to about four pounds and travel at a speed of about 22 miles per hour.


In thinking about the future - we can’t overlook the possibilities for very fundamental scientific discoveries - we are very far from understanding the nature of reality.

This Strange Light Particle Behaviour Challenges Our Understanding of Quantum Theory

It's even spookier than we predicted.
Scientists investigating how light particles (or photons) experience entanglement on the quantum scale have discovered something entirely unexpected, and it challenges long-held assumptions about the initial moments of what Einstein referred to as "spooky action at a distance".

When the team created entangled pairs of photons, these particles didn't originate in the same place and break away as predicted - they emerged from entirely different points in space, which means quantum theory might have to account for a whole lot more randomness than we thought.


Another milestone in the domestication of DNA and applications.

Japanese man is first to receive 'reprogrammed' stem cells from another person

World-first transplant, used to treat macular degeneration, represents a major step forward in movement to create banks of ready-made stem cells.
On 28 March, a Japanese man in his 60s became the first person to receive cells derived from induced pluripotent stem (iPS) cells donated by another person.

The surgery is expected to set the path for more applications of iPS-cell technology, which offers the versatility of embryonic stem cells without their ethical taint. Banks of iPS cells from diverse donors could make stem-cell transplants more convenient to perform, while slashing costs.

iPS cells are created by removing mature cells from an individual (for example, from their skin) and reprogramming these cells back to an embryonic state. They can then be coaxed into a type of cell useful for treating a disease.

In the latest procedure, performed on a man from the Hyogo prefecture of Japan, skin cells from an anonymous donor were reprogrammed into iPS cells and then turned into a type of retinal cell, which was in turn transplanted onto the retina of the patient, who has age-related macular degeneration. Physicians hope that the cells will stop the progression of the disease, which can lead to blindness.
