Thursday, December 3, 2015

Friday Thinking 4 December 2015

Hello – Friday Thinking is curated on the basis of my own curiosity and offered in the spirit of sharing. Many thanks to those who enjoy this. 

In the 21st Century curiosity will SKILL the cat.


“We think of the tree of life, with genetic material passing vertically from mom and dad. But with horizontal gene transfer becoming more widely accepted and more well known, at least in certain organisms, it is beginning to change the way we think about evolution and inheritance of genetic material,” said Boothby. “Instead of thinking of the tree of life, we can think about the web of life and genetic material crossing from branch to branch ... it’s exciting. We are beginning to adjust our understanding of how evolution works.”
The tardigrade genome has been sequenced, and it has the most foreign DNA of any animal

We live in a world that is transforming at an unprecedented speed, a world that is constantly challenging and disrupting the old ways we are used to doing things. Given the context, I believe that if politics wants to remain relevant and be useful to citizens, it needs to change its approach. It needs to experiment with new ways and new solutions. This is what we are doing at the ministry and it's quite groundbreaking. A lot of colleagues from other countries have expressed interest in my work and I hope a similar institution will soon be developed in other parts of the world.
Sweden's Minister of the Future Explains How to Make Politicians Think Long-Term

“Presence is far more intricate and rewarding an art than productivity. Ours is a culture that measures our worth as human beings by our efficiency, our earnings, our ability to perform this or that. The cult of productivity has its place, but worshipping at its altar daily robs us of the very capacity for joy and wonder that makes life worth living—for, as Annie Dillard memorably put it, ‘how we spend our days is, of course, how we spend our lives.’”


When we care about productivity, we’re always aiming at some future goal, and judging our days by what we’ve managed to produce. Presence, on the other hand, means focusing on the present moment without aiming at anything, and judging our days more in terms of our internal experiences. If we focus too much on productivity at the expense of presence, we might find our lives slipping away in a blur.


I think it’s possible to get a good balance of productivity and presence, but it requires some careful thought about what productivity really means.


“Productivity” has become such a buzzword that it can seem like it’s the goal in itself. But productivity is useless if what you’re producing isn’t meaningful or helpful to you or others in some way. The reason we really care about productivity—or the reason we should care—is that it allows us to do the things we care about as well and effectively as possible. Productivity isn’t a goal, but rather a tool for better achieving our goals.
The cult of productivity is preventing you from being productive

The power to understand and predict the quantities of the world should not be restricted to those with a freakish knack for manipulating abstract symbols.


When most people speak of Math, what they have in mind is more its mechanism than its essence. This "Math" consists of assigning meaning to a set of symbols, blindly shuffling around these symbols according to arcane rules, and then interpreting a meaning from the shuffled result. The process is not unlike casting lots.


This mechanism of math evolved for a reason: it was the most efficient means of modeling quantitative systems given the constraints of pencil and paper. Unfortunately, most people are not comfortable with bundling up meaning into abstract symbols and making them dance. Thus, the power of math beyond arithmetic is generally reserved for a clergy of scientists and engineers (many of whom struggle with symbolic abstractions more than they'll actually admit).


We are no longer constrained by pencil and paper. The symbolic shuffle should no longer be taken for granted as the fundamental mechanism for understanding quantity and change. Math needs a new interface.
Bret Victor - Kill Math Project
http://worrydream.com/KillMath/


“We have these things called computers, and we’re basically just using them as really fast paper emulators,” he says. “With the invention of the printing press, we invented a form of knowledge work which meant sitting at a desk, staring at little tiny rectangles and moving your hand a little bit. It used to be those tiny rectangles were papers or books and you’re moving your hand with a pen.


Now we’re staring at computer screens and moving our hands on a keyboard, but it’s basically the same thing. We’re computer users thinking paper thoughts.”


“The example I like to give is back in the days of Roman numerals, basic multiplication was considered this incredibly technical concept that only official mathematicians could handle,” he continues. “But then once Arabic numerals came around, you could actually do arithmetic on paper, and we found that 7-year-olds can understand multiplication. It’s not that multiplication itself was difficult. It was just that the representation of numbers — the interface — was wrong.”
The Utopian UI Architect
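Victor's Roman-numeral example is easy to make concrete in code. A minimal sketch, assuming nothing beyond the standard library: converting the numerals into positional notation is the entire difficulty, after which the "incredibly technical" operation is a single symbol.

```python
# Roman numerals encode quantity without place value, so even basic
# arithmetic means first translating into a positional representation.
ROMAN_VALUES = {'I': 1, 'V': 5, 'X': 10, 'L': 50, 'C': 100, 'D': 500, 'M': 1000}

def roman_to_int(numeral: str) -> int:
    """Convert a Roman numeral to an integer, handling subtractive forms."""
    total = 0
    for i, ch in enumerate(numeral):
        value = ROMAN_VALUES[ch]
        # A smaller symbol before a larger one (e.g. the I in IV) subtracts.
        if i + 1 < len(numeral) and value < ROMAN_VALUES[numeral[i + 1]]:
            total -= value
        else:
            total += value
    return total

# Once the representation is positional, multiplication is trivial:
print(roman_to_int('VII') * roman_to_int('VI'))  # 7 * 6 = 42
```

The hard part was never the multiplication; it was the interface to the quantity.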

There are lots of other factors that maintain the gap between the theoretical potential of our computers and our everyday use of them — technological acceleration, constant change, the sheer expense of writing software. But it all boils down to mindset in the end. Although it looks like ones and zeros, software “architects” are swinging around budgets you could use to build a skyscraper, and changing something late into a project like that has similar costs to tearing down a half-made building. Rows upon rows upon rows of expensive engineers throwing away months (or years) of work: the software freezes in place, and the world moves on. Everything is always slightly broken.


Over and over again, we go back to paper and metaphors from the age of paper because we cannot get the software right, and the core of that problem is that we managed to network the computers in the 1990s, but we never did figure out how to really network the databases and get them all working together.
Programmable blockchains in context: Ethereum’s Future

How should the IT industry prepare for the post-app era?
The IT industry should realise – and quickly – the pivotal role that artificial intelligence will play in how we all navigate information. Much of that is about embracing new IT skills. "Finding new types of skills to employ will be crucial," says Trainor. "People who understand cognitive computing and complex data structuring are important." He also suggests bringing more psychologists in-house to help create effective personalities for VPAs.


"Open source projects and investing in skills is the answer," says Mould. "The biggest software giants have all learnt this lesson and, as a result, their products and services serve a much wider ecosystem."
Goodbye apps, hello smart agents: Are you ready for the post-app world?

I love Bret Victor’s vision of the importance of design for the digital environment - a fundamental need to re-imagine the media to enable new ways of thinking. This is a MUST READ.
The Utopian UI Architect
An ex-Apple interface designer’s 40-year plan to redesign not just the way we use computers, but the way we think with them
When Bret Victor came in for his first day of work as a “Human Interface Inventor” at Apple, he found an iPad sitting on his desk. It was August of 2007. The original iPhone was only a couple months old. iPods still generated a third of the company’s revenue. The App Store didn’t exist.


“I said, ‘What is this?’ and they said, ‘We don’t know yet,’” Victor recalls. “It was an early prototype of the hardware. Maybe twenty other people in the world knew about it. Nobody had answers. My job was to try to figure it out.”


Victor spent the next two months making “an app every week, explor[ing] new UI ideas” as part of a three-person “internal R&D prototyping group” that Apple had assembled. Soon, Victor says, the iPad became “a real project” and his group was reassigned to tinker with other experimental hardware. “Everyone [at Apple] was focused on the next product release, what needed to happen six months from now,” he says. “We were the ones looking at what could happen five or ten years from now.”


Seven years later, in 2014, Bret Victor is still looking at “what could happen” to human-computer interaction in the near future. He’s still part of a close-knit internal prototyping lab (the Communications Design Group) bankrolled by a huge tech corporation (SAP). But instead of designing interfaces and exploring use cases for tomorrow’s glass-screened gadgets, Victor’s “forty-years-out vision” concerns nothing less than redesigning computing itself — not as a product or service, but “as a medium of thought.”


In other words, Victor practices what he preaches: he doesn’t use computers to build better mousetraps, but to explore and communicate ideas in a way that uniquely exploits the properties and possibilities of a programmable, dynamic, interactive medium.

This is a longish but excellent read for anyone interested in understanding the emerging technology of the blockchain, with some insight into the current developments involving Ethereum and the potential for smart contracts and distributed autonomous organizations - set in a more historical context, this is a great place to start. In fact I think this is a MUST READ because the blockchain is much less about ‘currency’ than it is a revolution in the database paradigm - it’s a 21st century database technology.
Programmable blockchains in context: Ethereum’s Future
Ethereum brings up strong emotions. Some have compared it to SkyNet, the distributed artificial intelligence of the Terminator movies. Others once suggested the entire thing is a pipe dream. The network has been up for a few months now, and is showing no signs of hostile self-awareness — or total collapse.


But, in truth, it’s not that difficult to understand Ethereum, blockchains, Bitcoin and all the rest — at least the implications for people just going about their daily business, living their lives. Even a programmer who wants a clear picture can get a good enough model of how it all fits together fairly easily. Blockchain explainers usually focus on some very clever low-level details like mining, but that stuff really doesn’t help people (other than implementers) understand what is going on. Rather, let’s look at how blockchains fit into the more general story about how computers impact society.


The actual blockchain story starts in the 1970s when the database as we currently know it was created: the relational model, SQL, big racks of spinning tape drives, all of that stuff. If you’re imagining big white rooms with expensive beige monoliths watched over by men in ties, you’re in the right corner of history. In the age of Big Iron, big organizations paid big bucks to IBM and the rest for big databases and put all their most precious data assets in these systems: their institutional memory and customer relationships. The SQL language which powers the vast majority of the content management systems which run the web was originally a command language for tape drives. Fixed field lengths — a bit like the 140 character limit on tweets — originally served to let impatient programs fast forward tapes a known distance at super high speed to put the tape head exactly where the next record would begin. This was all going on round about the time I was born — it’s history, but it’s not yet ancient history.
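The fixed-field trick is simple enough to show in a few lines. If every record has the same length, the location of record n is pure arithmetic (n times the record size), so the tape head - or today, a file pointer - can jump straight there without reading anything in between. A toy sketch in Python; the file name and field layout are invented for illustration:

```python
import struct

# Fixed-width record: 10-byte account id, 30-byte name, 4-byte balance.
RECORD_FORMAT = '10s30sI'
RECORD_SIZE = struct.calcsize(RECORD_FORMAT)

def read_record(f, n):
    """Jump straight to record n: the 'fast forward a known distance' move."""
    f.seek(n * RECORD_SIZE)          # offset is pure arithmetic, no scanning
    return struct.unpack(RECORD_FORMAT, f.read(RECORD_SIZE))

# Usage sketch:
# with open('accounts.dat', 'rb') as f:
#     print(read_record(f, 42))
```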


Bitcoin took the paper out of that system, and replaced it with a stable agreement (“consensus”) between all the computers in the bitcoin network about the current value of all the accounts involved in a transaction. It did this with a genuinely protocol-style solution: there’s no middleman extracting rents, and no exponential system complexity from a myriad of different connectors. The blockchain architecture is essentially a protocol which works as well as hub-and-spoke for getting things done, but without the liability of a trusted third party in the center which might choose to extract economic rents. This is really a good, good thing. The system has some magic properties — same agreed data on all nodes, eventually — which go beyond paper and beyond databases. We call it “distributed consensus” but that’s just a fancy way of saying that everybody agrees, in the end, about what truth (in your bank balance, in your contracts) is.
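The "same agreed data on all nodes" property rests on a simple data structure: every block commits to its predecessor's cryptographic hash, so rewriting anything in the history changes every later hash and is immediately visible to every participant. A minimal sketch of that chaining, leaving out mining, networking and the consensus rules themselves:

```python
import hashlib
import json

def make_block(transactions, prev_hash):
    """A block commits both to its own contents and to its predecessor."""
    digest = hashlib.sha256(
        json.dumps([transactions, prev_hash], sort_keys=True).encode()
    ).hexdigest()
    return {'transactions': transactions, 'prev_hash': prev_hash, 'hash': digest}

genesis = make_block(['alice pays bob 5'], prev_hash='0' * 64)
block1 = make_block(['bob pays carol 2'], prev_hash=genesis['hash'])

# Tampering with the genesis transactions changes its hash, so block1's
# prev_hash no longer matches -- every node can detect the rewrite.
```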


In essence, Ethereum simulates a perfect machine — a thing which could never exist in nature because of the laws of physics, but which can be simulated by a large enough computer network. The network’s size isn’t there to produce the fastest possible computer (although that may come later with blockchain scaling) but to produce a universal computer which is accessible from anywhere by anybody, and (critically!) which always gives the same results to everybody. It’s a global resource which stores answers and cannot be subverted, denied or censored.
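"Always gives the same results to everybody" is the replicated-state-machine idea: every node starts from the same state and applies the same transactions under the same deterministic rules, so all nodes compute identical answers without trusting one another. A toy illustration only, nothing like the real Ethereum Virtual Machine:

```python
def apply_tx(balances, tx):
    """Deterministic rule: same state + same transaction => same new state."""
    sender, receiver, amount = tx
    if balances.get(sender, 0) >= amount:   # every node enforces this check
        balances[sender] -= amount
        balances[receiver] = balances.get(receiver, 0) + amount
    return balances

# Any node replaying this log from the same starting state computes the
# same result -- no coordinator or trusted middleman required.
state = {'alice': 10}
for tx in [('alice', 'bob', 5), ('bob', 'carol', 2)]:
    state = apply_tx(state, tx)
print(state)   # {'alice': 5, 'bob': 3, 'carol': 2}
```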


This is kind of a big deal.
In fact, it breaks with 40 years of experience of connecting computers together to do things. As a fundamental technique, blockchains are new. And in this branch of technology, genuinely new ideas move billions of dollars and set the direction of industries for decades. They are rare.

Here’s something fascinating that points to the future of shopping - whether it’s in a store, mall or at home - a way to view and learn about an item regardless of current stock. It’s early days yet for this technology - but think about the next decade - how will this transform the shopping mall and the experience of shopping? I’m imagining that a person could have their body-scan avatar don any form of clothing and then watch themselves from all angles to see how it looks and fits.
Microsoft and Volvo's new HoloLens showroom is fascinating and frustrating
You could buy a car in mixed reality, but should you?
Cars have a long history with augmented and virtual reality. Designers rely on immersive systems, from CAVE rooms to augmented reality headsets, to visualize their work. Drivers have been using heads-up displays for decades, even if they’re projected onto a windshield and not a pair of glasses.


Bringing Microsoft’s HoloLens headset to the auto industry, though, feels much bigger. Unlike more specialized augmented reality tools, it’s something that Microsoft eventually hopes ordinary people will buy and use. And the quality of its images is nearly unprecedented; you can almost suspend disbelief and imagine the objects it projects are real. That’s what makes the company’s latest partnership so potentially exciting — and, at the same time, so frustrating.

Let’s add the power of search and more AI and this is what’s coming in the next couple of years.
Goodbye apps, hello smart agents: Are you ready for the post-app world?
Recent research shows that apps are on the wane
Gartner predicts that by year-end 2016, more complex purchasing decisions – such as where parents buy back-to-school equipment – will be made autonomously by digital assistants. That kind of spending will soon reach $2 billion (around £1.3 billion, AU$2.8 billion) annually.


"That translates to roughly 2.5% of mobile users trusting assistants with $50 a year," says Armstrong, who thinks it's significant, and that businesses will begin to partner with those in the supply chain to collaborate and deliver ultra-personalised, real-time offerings from which the digital assistant can then consciously choose between. "This may mean that businesses build their own digital assistants and platform offerings or chose to integrate their data with existing or new artificial intelligence tools," says Armstrong.


"Over the next 18 months we'll see single point of contact digital assistants used to triage people through to the correct answer to their problem," says Trainor, who thinks it's akin to every customer of a major bank having a personal banking assistant in their pocket.


"Instead of hunting, you just ask, and your next best action – a piece of content, a balance transfer, a payment to a provider, a phone number to call – is given straight to you," he says. "That's the next generation of concierge banking, and it's going to be facilitated by artificial intelligence-driven personal agents and virtual personal assistants."

This is a fascinating article in the truth-is-stranger-than-fiction category - the pictures and visuals are worth the view if only for entertainment - it looks like 1/6th of this little creature’s DNA is ‘alien’.
The tardigrade genome has been sequenced, and it has the most foreign DNA of any animal
Scientists have sequenced the entire genome of the tardigrade, AKA the water bear, for the first time. And it turns out that this weird little creature has the most foreign genes of any animal studied so far – or to put it another way, roughly one-sixth of the tardigrade's genome was stolen from other species. We have to admit, we're kinda not surprised.


A little background here for those who aren’t familiar with the strangeness that is the tardigrade – the microscopic water creature grows to just over 1 mm on average, and is the only animal that can survive in the harsh environment of space. It can also withstand temperatures from just above absolute zero to well above the boiling point of water, can cope with ridiculous amounts of pressure and radiation, and can live for more than 10 years without food or water. Basically, it's nearly impossible to kill, and now scientists have shown that its DNA is just as bizarre as it is.


So what's foreign DNA and why does it matter that tardigrades have so much of it? The term refers to genes that have come from another organism via a process known as horizontal gene transfer, as opposed to being passed down through traditional reproduction.


Horizontal gene transfer occurs in humans and other animals occasionally, usually as a result of gene swapping with viruses, but to put it into perspective, most animals have less than 1 percent of their genome made up of foreign DNA. Before this, the rotifer – another microscopic water creature – was believed to have the most foreign genes of any animal, with 8 or 9 percent.
But the new research has shown that approximately 6,000 of the tardigrade’s genes come from foreign species, which equates to around 17.5 percent.
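Taking the article's two figures at face value, the arithmetic implies the size of the whole gene set:

$$\frac{6{,}000 \ \text{foreign genes}}{0.175} \approx 34{,}000 \ \text{genes in total}$$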


“We had no idea that an animal genome could be composed of so much foreign DNA,” said study co-author Bob Goldstein, from the University of North Carolina at Chapel Hill. “We knew many animals acquire foreign genes, but we had no idea that it happens to this degree.”

The fascinating progress being made in our understanding of DNA makes it hard to predict where we will be in the next few decades. Here’s something that may confound older studies looking to understand the link between genetics and behavior.
Ghosts in the Genome
How one generation’s experience can affect the next
In one of the 20th century’s most disastrous collisions of political ideology and science, the Russian botanist Trofim Lysenko steered the USSR’s agricultural research policies to deemphasize the deterministic concepts of Mendelian inheritance. Instead, Lysenko was committed to the idea that, within the space of a single generation, the environment could alter the phenotype of future generations, an idea that is now often (imprecisely) referred to as “Lamarckian” inheritance. In Lysenko’s view, Mendelian inheritance, along with Darwinian evolution, emphasizes competition, whereas he believed that biology was based on cooperation, and that hard work in one generation should rapidly lead to the betterment of the species.


Lysenko was among the most infamous purveyors of the idea that the environment experienced by an organism could influence the phenotype in future generations, and he was rightly denounced as a charlatan because he falsified results in pursuit of his goal. However, the scientific community has discovered over the past few decades that the idea that acquired characters can be inherited may not be completely off the mark. It turns out that epigenetic marks, information not encoded in the genome’s sequence, do respond to environmental conditions within an organism’s lifetime, and recent evidence suggests that such information may be inherited.


These findings have helped motivate modern research into the oft-discredited study of transgenerational effects of the environment. Researchers are now beginning to understand the mechanisms of epigenetic inheritance and to generate evidence for the idea that the experiences of an ancestral population can influence future generations.

Here’s another example of the looming transformation in how science and social science will do research.
The Latest Breakthrough In Understanding Diabetes Was Made By An Algorithm
Researchers now believe there are three different kinds of type 2 diabetes—a result discovered with help from machines combing through reams of medical data.
With the cost to sequence a human genome dropping by the day and medical records finally going digital, public health experts are excited for a new era of personalized, or "precision," medicine—a big data future in which there is no "average" patient, only individual patients with unique genes, environments, and lifestyles. As a measure of this excitement, this year, President Obama launched a $215 million initiative that will create a health database from 1 million volunteers that is unprecedented in detail. Breakthroughs in the prevention, understanding, and treatment of disease are hoped for.


Though there’s both hype and a lot of genuine promise, the field of precision medicine is still in its nascency. Genetic sequencing has helped in the diagnosis and treatment of rare genetic disease and is beginning to be important in the treatment of some cancers, such as lung cancer or brain tumors.


Now a recent study, published in the journal Science Translational Medicine, demonstrates the broader promise of precision medicine beyond genome sequencing—and in understanding an extremely common disease: type 2 diabetes.
Almost 1 in 10 Americans have type 2 diabetes, and many more are at risk. Yet it’s a poorly understood disease: Its causes, symptoms, and complications are diverse and hard for doctors to predict. By mining a database of clinical and genetic data from more than 2,500 diabetes patients, researchers at the Icahn School of Medicine at Mount Sinai Medical Center have now identified some patterns that an entire field of doctors has not: They found there are actually three distinct sub-types of type 2 diabetes, each of which has very different health implications.


"This is the first tangible demonstration of precision medicine that could be applied to a more common, complex disease," says study author Joel Dudley, director of biomedical informatics at Mount Sinai (and one of Fast Company’s Most Creative People of 2014), a major hospital in upper Manhattan.


They were able to do this with access to a still relatively rare collection of thousands of Mount Sinai patients who volunteered to give their health charts and genetic data to the hospital for research efforts (see "In The Hospital Of The Future, Big Data Is Your Doctor"). Usually, doctors just look at a few blood tests—such as blood sugar and insulin levels—when monitoring diabetes patients. Instead, the researchers used computer modeling to map how similar each patient was to every other (a "patient-patient similarity network"), based on every piece of health data—height, weight, blood platelet counts, and hundreds of data points that human doctors alone could never process. The result was a map showing that Mount Sinai patients fall into three distinctive clusters, or "sub-types."
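The "patient-patient similarity network" idea can be sketched generically: put every clinical variable on a common scale, measure how similar each patient is to every other, and let a clustering algorithm partition the resulting graph. The sketch below uses random stand-in data and off-the-shelf spectral clustering; it illustrates the general approach, not the study's actual method or data.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import SpectralClustering

# Stand-in matrix: one row per patient, one column per clinical measurement
# (height, weight, platelet counts, blood sugar, ...); real data would go here.
rng = np.random.default_rng(0)
patients = rng.normal(size=(2500, 100))

# Standardize so no single test dominates the similarity measure.
X = StandardScaler().fit_transform(patients)

# Spectral clustering builds a patient-patient similarity graph internally
# and partitions it into k groups -- here, three candidate sub-types.
labels = SpectralClustering(n_clusters=3, affinity='nearest_neighbors',
                            random_state=0).fit_predict(X)
print(np.bincount(labels))   # how many patients land in each cluster
```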

This is something that seems like it should have been discovered long ago.
We Just Discovered 6 New Kinds of Brain Cells
The map of the human brain gets a little more complete.
Thanks to a handful of newly discovered neurons, the brain just became a little bit less mysterious.


Today a team of neuroscientists led by Xiaolong Jiang and Andreas Tolias at Baylor College of Medicine in Houston announced six altogether new types of brain cells. The neuroscientists came across these new neurons while conducting a census of brain cells in adult mice in a part of the brain called the primary visual cortex, an area chiefly concerned with sight. The researchers credit their new insight to a recently developed method of slicing razor-thin slices of mature brain. The discovery is reported today in the journal Science.


"Just asking 'what types of cells make up the brain' is such a basic question... that establishing a complete census of all neuron cell types is of great importance in moving the field of neuroscience forward," says Tolias, at Baylor College of Medicine.​
Most previous studies investigating the odd menagerie of brain cells have used juvenile mice, mostly because it's easier to get high-resolution pictures of their brains. But there's a problem: Brains keep maturing and complicating as they get older, and Jiang's team believes that their new-found neurons might not form until adulthood.

Even greater advances in our knowledge of living systems may arrive faster still.
Biomedical imaging at one-thousandth the cost
Mathematical modeling enables $100 depth sensor to approximate the measurements of a $100,000 piece of lab equipment
MIT researchers have developed a biomedical imaging system that could ultimately replace a $100,000 piece of lab equipment with components that cost just hundreds of dollars.


The system uses a technique called fluorescence lifetime imaging, which has applications in DNA sequencing and cancer diagnosis, among other things. So the new work could have implications for both biological research and clinical practice.


“The theme of our work is to take the electronic and optical precision of this big expensive microscope and replace it with sophistication in mathematical modeling,” says Ayush Bhandari, a graduate student at the MIT Media Lab and one of the system’s developers. “We show that you can use something in consumer imaging, like the Microsoft Kinect, to do bioimaging in much the same way that the microscope is doing.”
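The "sophistication in mathematical modeling" is, at heart, parameter estimation: fluorescence intensity after an excitation pulse decays roughly as I(t) = A·exp(-t/τ), and the lifetime τ can be recovered by fitting that model to noisy, cheaply acquired samples instead of measuring the decay with precision optics. A minimal curve-fitting sketch on synthetic data; this shows the generic idea, not the MIT team's actual algorithm:

```python
import numpy as np
from scipy.optimize import curve_fit

def decay(t, amplitude, lifetime):
    """Single-exponential fluorescence decay: I(t) = A * exp(-t / tau)."""
    return amplitude * np.exp(-t / lifetime)

# Synthetic measurement: true lifetime 2.0 ns, sampled coarsely and noisily,
# standing in for what an inexpensive time-of-flight sensor might record.
t = np.linspace(0, 10, 40)                        # time in nanoseconds
rng = np.random.default_rng(1)
signal = decay(t, 1.0, 2.0) + rng.normal(0, 0.02, t.size)

(amp, tau), _ = curve_fit(decay, t, signal, p0=(1.0, 1.0))
print(f"estimated lifetime: {tau:.2f} ns")        # ~2.0 despite the noise
```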


The MIT researchers reported the new work in the Nov. 20 issue of the journal Optica. Bhandari is the first author on the paper, and he’s joined by associate professor of media arts and sciences Ramesh Raskar and Christopher Barsi, a former research scientist in Raskar’s group who now teaches physics at the Commonwealth School in Boston.

This is an interesting project and 7 min video - reconstructing past memories as a real experience through mixed reality.
Hyundai : Going Home
2015, 65-year Division of Korea
Here is an old man who has been longing for home in North Korea.
For Kim Gu-Hyeon and other displaced persons pining for home beyond the barbed-wire fence, Hyundai Motor started the “Going Home” project.


Using a combination of automotive design technology and Vworld (Spatial Information Open Platform), Mr. Kim’s hometown was virtually recreated, including the wildflowers and streets where he used to walk every day.


On the day the two Koreas reunite as one nation, Mr. Kim’s journey back home will be accompanied by Hyundai Motor Group.

This is a wonderful article and short 4 min video from Cyborg Nation (something worth subscribing to). It is also about open source ‘hacking’, including creating brain-computer interfaces. This is where our schools and libraries should be going.
Watch These Guys Make a Shark Swim With Their Minds
In this week’s episode of Cyborg Nation, five guys make a shark swim through the air—with their minds. Every time you raise an eyebrow or think a thought, electrical signals zoom through your brain. Electrodes on your head can pick those signals up and transmit them to a computer. And if the computer is paired with a shark, you can totally make that thing shake its tail fin.


OpenBCI, a company based in Brooklyn, is dedicated to making these so-called brain computer interfaces (BCIs) available to anyone who wants to take a crack at controlling machines with their minds. The set-up is pretty simple: You connect electrodes to a small, battery-powered circuit board, which records your body’s electrical signals and sends them to the program running on the computer.
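In outline, the software side of such a rig is unglamorous: read a stream of voltage samples, estimate the signal power in a frequency band of interest, and map threshold crossings to commands. A deliberately simplified sketch of that loop; the sampling rate, band and threshold are placeholder values, and this is not the actual OpenBCI software:

```python
import numpy as np
from scipy.signal import welch

FS = 250   # samples per second -- a common rate for consumer EEG boards

def band_power(window, low, high):
    """Estimate average signal power in a frequency band (Welch's method)."""
    freqs, psd = welch(window, fs=FS, nperseg=len(window))
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()

def classify(window):
    """Map one second of EEG samples to a command via a crude threshold."""
    alpha = band_power(window, 8, 12)           # alpha band rises with relaxation
    return 'dive' if alpha > 1e-6 else 'swim'   # placeholder threshold

# Usage sketch: for each one-second window streamed from the board,
# send classify(np.asarray(window)) to the shark controller.
```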


At their first hackathon, OpenBCI plays with a few fun interfaces. In one, you can make a robotic arm move by flexing your own arm. Another shows three people working together to make three robot spiders skulk along a table. And in a third, five guys independently think of five separate commands—dive, swim, climb, left, right—to control an inflatable shark moving through the air.


On their website, OpenBCI has made the hardware design and interface software open source, and with a bit of handiness and access to a 3-D printer, you could be controlling—and tinkering with—robot arms from the comfort of your own home.
“Innovation happens faster when software and hardware are open source, when people can change and modify it to their desire,” says Joel Murphy, co-founder of OpenBCI. The hope is that anyone can get in on the BCI revolution. All they need is a board, a few electrodes, and the will to make a shark fly.

Coming to a human near you in the near future - digital tattoos - beyond the post-modern primitive - this is moving toward the meta-real transhuman. A 2 min video is also included.
Chaotic Moon Explores Biometric Tattoos For Medicine And The Military
The future of wearables could be inked on your skin. Chaotic Moon, a software design and development firm based in Austin, Texas, is developing a high-tech tattoo made of components and conductive paint to create circuitry to basically turn you into a cyborg… er, collect health and other biometric data from your body.


Chaotic Moon’s tattoo kit is in the nascent prototype stage right now, but CEO Ben Lamm told me it will be able to collect and upload health and informational data, much like Jawbone or the Apple Watch, and send it to medical staff – or maybe even the military.


“This is the new wearable,” Lamm told TechCrunch. “The future of wearables is biowearables.”


“This is not something that can be easily removed like a Fitbit. It can be underneath a flak jacket, directly on the skin to be collecting this data and being reported back,” Lamm said of military applications.


The tattoo is temporary and washes off much like a temporary fashion tattoo. According to Chaotic Moon, the tatt will have the ability to monitor body temperature and detect if someone is stressed based on sweat, heart rate and hydration level information uploaded via Bluetooth or location-based low-frequency mesh networks like those used for apps like Jott or Firechat.

This may seem like a small innovation - but it has amazing possibilities for implementation everywhere.
By infusing H2O with ultrasonic bubbles, StarStream gives tap water incredible cleaning power
What if you could wash your hands thoroughly and effectively with nothing more than cold water? What if you could clean countertops, floors, or even medical tools without using any harsh chemicals? A new device called StarStream creates a whole new kind of cleaning solution by combining ultrasound waves and bubbles with regular cold water. Using a single nozzle, StarStream can load any liquid with ultrasonic cleaning bubbles, bringing micro-scrubbing power to plain old tap water or increasing the cleaning power of detergents.


Researchers from the University of Southampton have demonstrated how a pioneering ultrasonic device can significantly improve the cleaning of medical instruments and reduce contamination and risk of infection.


StarStream, invented and patented by the University of Southampton and in commercial production by Ultrawave Ltd., makes water more efficient for cleaning by creating tiny bubbles which automatically scrub surfaces. The device supplies a gentle stream of water through a nozzle that generates ultrasound and bubbles, which dramatically improve the cleaning power of water, reducing the need for additives and heating.


Using just cold water, StarStream was able to remove biological contamination, including brain tissue from surgical steel. Cleaning instruments between patients is critical to avoid transmission of agents leading to conditions such as Creutzfeldt-Jakob Disease. It was also able to remove bacterial biofilms that typically cause dental disease and was effective at removing soft tissue from bones, which is required prior to transplants to prevent rejection of the transplanted material by the recipient's immune system.

There’s been a lot of talk about the need to make mail delivery less expensive and the seemingly emphasized solution is community mailboxes - but here’s where I think the future is - aerial to-the-door delivery. Also a lovely and entertaining 2 min video.
Amazon taps Jeremy Clarkson to show off its new 30-minutes-or-less delivery drones
Amazon wants customers to know its delivery drones aren’t just a figment of imagination, but an imminent reality. The company has released a new ad starring former Top Gear host Jeremy Clarkson teasing its forthcoming delivery system that promises to drop off your order in 30 minutes or less.


It’s testing several different prototypes, including one that will be able to carry payloads of up to five pounds, flying under 400 feet. Amazon says it will cover a distance of up to 50 miles and use “sense and avoid” technology to prevent collisions with other aircraft and safely operate beyond the line of sight to distances of 10 miles or more.

In 1995 science fiction writer Neal Stephenson published a book called ‘The Diamond Age’ (I highly recommend it - it is just as relevant today) in which nano-fabrication - sort of like 3D printing with carbon atoms - was prevalent. Many common items were fabricated out of diamond. It may just be that the idea was not so far-fetched.
New phase of carbon discovered: Making diamonds at room temperature
Scientists have discovered a new phase of solid carbon, called Q-carbon, which is distinct from the known phases of graphite and diamond. They have also developed a technique for using Q-carbon to make diamond-related structures at room temperature and at ambient atmospheric pressure in air.


"We can create diamond nanoneedles or microneedles, nanodots, or large-area diamond films, with applications for drug delivery, industrial processes and for creating high-temperature switches and power electronics," Narayan says. "These diamond objects have a single-crystalline structure, making them stronger than polycrystalline materials. And it is all done at room temperature and at ambient atmosphere -- we're basically using a laser like the ones used for laser eye surgery. So, not only does this allow us to develop new applications, but the process itself is relatively inexpensive."

And a significant advance in 3D printing.
Toshiba Machine’s New 3D Printer Can Print Iron And Steel 10 Times Faster Than Other Models
The bright minds at Toshiba have developed a new nozzle that has enabled the company to build a 3D metal printer that it says is 10 times faster than units that rely on powder bed fusion.


Toshiba will showcase the new 3D metal printer at Monozukuri Matching Japan 2015, which will be held Dec. 2 to 4.


If current powder bed fusion printers can be thought of as inkjet printers, then Toshiba’s new machine would be likened to a laser printer. Powder bed fusion techniques employ lasers, but the trick here is in the way the “print head” works.


Toshiba’s LMD (Laser Metal Deposition) printer uses lasers to sinter – or melt just shy of the point of liquefaction – tiny metal particles, modeling these into whatever shape the computer-aided design calls for.


The secret to the LMD printer’s high-speed output is the new nozzle, which was developed based on the company’s knowledge and expertise in fluid simulation technology.

This new breakthrough has some significant implications for the future of computing - although still a ways off.
Strange quantum phenomenon achieved at room temperature in semiconductor wafers
“The macroscopic world that we are used to seems very tidy, but it is completely disordered at the atomic scale. The laws of thermodynamics generally prevent us from observing quantum phenomena in macroscopic objects,” said Paul Klimov, a graduate student in the Institute for Molecular Engineering and lead author of new research on quantum entanglement. The institute is a partnership between UChicago and Argonne National Laboratory.


Previously, scientists have overcome the thermodynamic barrier and achieved macroscopic entanglement in solids and liquids by going to ultra-low temperatures (-270 degrees Celsius) and applying huge magnetic fields (1,000 times larger than that of a typical refrigerator magnet) or using chemical reactions. In the Nov. 20 issue of Science Advances, Klimov and other researchers in Prof. David Awschalom’s group at the Institute for Molecular Engineering have demonstrated that macroscopic entanglement can be generated at room temperature and in a small magnetic field.


The researchers used infrared laser light to order (preferentially align) the magnetic states of thousands of electrons and nuclei and then electromagnetic pulses, similar to those used for conventional magnetic resonance imaging (MRI), to entangle them. This procedure caused pairs of electrons and nuclei in a macroscopic 40 micrometer-cubed volume (the volume of a red blood cell) of the semiconductor SiC to become entangled.
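For readers who want the textbook picture, the simplest entangled state of an electron spin and a nuclear spin is the singlet, which cannot be factored into separate states for the two particles (the states prepared in the actual experiment are more elaborate):

$$|\psi\rangle = \frac{1}{\sqrt{2}}\left(\,|\uparrow_e\,\downarrow_n\rangle - |\downarrow_e\,\uparrow_n\rangle\,\right)$$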


“We know that the spin states of atomic nuclei associated with semiconductor defects have excellent quantum properties at room temperature,” said Awschalom, the Liew Family Professor in Molecular Engineering and a senior scientist at Argonne. “They are coherent, long-lived and controllable with photonics and electronics. Given these quantum ‘pieces,’ creating entangled quantum states seemed like an attainable goal.”
In addition to being of fundamental physical interest, “the ability to produce robust entangled states in an electronic-grade semiconductor at ambient conditions has important implications on future quantum devices,” Awschalom said.

The acceleration of alternative energy is also hitting the technologies of energy storage.
GM says li-ion battery cost per kWh already down to $145
EV Battery Costs May Approach Tesla's Levels By 2020
General Motors' first mass-market battery-electric vehicle will be pretty small. The good news is that the costs of the battery packs powering those EVs are getting smaller as well. It's a small world, after all.


GM recently showed a slide that said that its lithium-ion battery costs are down to about $145 per kilowatt hour, Hybrid Cars says, citing comments General Motors made at its Global Business Conference in Michigan. The company also showed that these costs may drop to $100/kWh by 2021. Heady stuff considering that Tesla Motors has said it'd hit those cost levels in 2020, and Tesla is not outsourcing the job like GM is (LG Chem makes the cells for the 2015 Spark EV and the Chevy Volts). General Motors representatives confirmed to Autoblog that GM executive Mark Reuss presented the battery-cost estimate slide in a presentation at the conference.
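To put the per-kilowatt-hour figures in whole-pack terms, take an illustrative 60 kWh pack (a size assumed here purely for the arithmetic):

$$60\ \text{kWh} \times \$145/\text{kWh} = \$8{,}700 \qquad \text{vs.} \qquad 60\ \text{kWh} \times \$100/\text{kWh} = \$6{,}000$$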

For anyone who has heard of the placebo effect - here is an equally important complement - the nocebo. This is a longish piece but worth the read for anyone interested in the potential negative consequences of inappropriate framing, metaphor and priming.
The contagious thought that could kill you
To die, sometimes you need only believe you are ill, and as David Robson discovers, we can unwittingly ‘catch’ such fears, often with terrifying consequences.
Beware the scaremongers. Like a witchdoctor’s spell, their words might be spreading modern plagues.


We have long known that expectations of a malady can be as dangerous as a virus. In the same way that voodoo shamans could harm their victims through the power of suggestion, priming someone to think they are ill can often produce the actual symptoms of a disease. Vomiting, dizziness, headaches, and even death, could be triggered through belief alone. It’s called the “nocebo effect”.


But it is now becoming clear just how easily those dangerous beliefs can spread through gossip and hearsay – with potent effect. It may be the reason why certain houses seem cursed with illness, and why people living near wind turbines report puzzling outbreaks of dizziness, insomnia and vomiting. If you have ever felt “fluey” after a vaccination, believed your cell phone was giving you a headache, or suffered an inexplicable food allergy, you may have also fallen victim to a nocebo jinx. “The nocebo effect shows the brain’s power,” says Dimos Mitsikostas, from Athens Naval Hospital in Greece. “And we cannot fully explain it.”


Over the last 10 years, doctors have shown that this nocebo effect – Latin for “I will harm” – is very common. Reviewing the literature, Mitsikostas has so far documented strong nocebo effects in many treatments for headache, multiple sclerosis, and depression. In trials for Parkinson’s disease, as many as 65% report adverse events as a result of their placebo. “And around one out of 10 treated will drop out of a trial because of nocebo, which is pretty high,” he says.

For Fun
This is fun and funny especially for scientists and trend seekers. The graphs are totally compelling.
Spurious Correlations
Spurious Correlations was a project I put together as a fun way to look at correlations and to think about data. Empirical research is interesting, and I love to wonder about how variables work together. The charts on this site aren't meant to imply causation nor are they meant to create a distrust for research or even correlative data. Rather, I hope this project fosters interest in statistics and numerical research.


I'm not a math or statistics researcher (and there are better ways to calculate correlation than I do here), but I do have a love for science and discovery and that's all anyone should need. Presently I am working on my J.D. at Harvard Law School.
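The statistical point is easy to demonstrate: any two series that merely trend over the same period will show a large Pearson correlation with no causal connection whatsoever. A few lines with made-up data:

```python
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(2000, 2015)

# Two unrelated quantities that both happen to drift upward over time.
cheese_consumption = 30 + 0.5 * (years - 2000) + rng.normal(0, 0.3, years.size)
engineering_phds = 480 + 9.0 * (years - 2000) + rng.normal(0, 5.0, years.size)

r = np.corrcoef(cheese_consumption, engineering_phds)[0, 1]
print(f"r = {r:.2f}")   # near 1.0: a 'totally compelling' spurious correlation
```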

Here’s another demonstration of what’s possible in Minecraft.
Verizon Web Browser and Video Calling in Minecraft
I worked with Verizon to bring video calling, web browsing and more to Minecraft.
How they made it all work: http://verizoncraft.github.io