So happy to be back to the Digital World.
In the 21st Century curiosity will SKILL the cat.
If I want an organization to behave in a certain way, I need to design for that.
The whole culture of the place says 'we're looking for better ideas,' not 'keep your feet off the furniture.' Every element is meant to stir innovation; the fungible wall system, for instance, shows how the space is meant to be reworked daily, if not hourly.
George Kembel - THE IDEA LAB: A LOOK AT STANFORD'S D.SCHOOL
It dawned on me a few months ago that the mechanism to create and sustain a place like Mycelia exists now with the help of blockchain technology and crypto-currencies. I am for the first time EVER, really excited and positive for the future of music and its industry; for artists old and (more importantly) new, along with the hyper enriched feedback loops that could exist with their listeners, collaborators and flag wavers. The FLOW of creativity, collaboration, storytelling and connecting on so many levels is going to change big time, save time… and just in time!
Its success will come from adoption by millions of music lovers: a grand-scale, ongoing, collective project like no other. To document, protect and share that which we love and build a place for it to grow, enabling future generations of artists to blossom as well as honouring those of the past.
Open source: a living, breathing, smart, decentralised, transparent, adaptable, useful, shining home for our love of music. A home which allows creativity to flow, connect and facilitate collaboration on so many levels, many of which just haven’t been possible. This grand library of all music would form the basis upon which all music businesses, from digital radio to tour bookings, can grow and thrive. Empowering the artists, finally landing the industry on its feet.
Inspired by the largest living organisms on earth, ancient, unseen, core to life itself, Mycelium (plural Mycelia) can stretch for miles, beneath the surface.
Each artist acting like its own Mycelium, in full animated dialogue with others on the global network.
Mycelia is huge, as it holds all music-related information ever recorded anywhere, yet this organism stretches across our planet between hundreds of thousands of personal computers. It is the world’s greatest and most treasured library, and it belongs to the two collective parties who alone make music complete: the music makers and their audience.
Derivatives too… remixes and reworks. These would all point back to the “genesis track,” as say “genesis blocks” do on the Blockchain; forking off and creating more branches, mapping its journey as it goes for all who are curious to see and research the life of a song. Some works may span across decades or even a whole century. Each derivative would create a new Spore and “hash,” which are documented on the Blockchain and divvy up the proceeds accordingly.
Imogen Heap's Mycelia: An Artists' Approach for a Fair Trade Music Business, Inspired by Blockchain
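For the technically curious, the derivative-tracking idea above can be sketched in a few lines - a minimal illustration only, assuming a simple hash-linked record structure (the track names and fields here are invented, and a real system would live on a blockchain rather than in memory):

```python
import hashlib
import json

def make_record(title, parent_hash=None):
    """Create a track record that links back to its parent via a hash,
    the way a derivative work would point toward its genesis track."""
    record = {"title": title, "parent": parent_hash}
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    return record

def lineage(record, index):
    """Walk parent hashes back to the genesis track."""
    chain = [record["title"]]
    while record["parent"] is not None:
        record = index[record["parent"]]
        chain.append(record["title"])
    return chain

# A genesis track and two generations of derivatives.
genesis = make_record("Original Song")
remix = make_record("Club Remix", parent_hash=genesis["hash"])
rework = make_record("Acoustic Rework", parent_hash=remix["hash"])

index = {r["hash"]: r for r in (genesis, remix, rework)}
print(lineage(rework, index))
# → ['Acoustic Rework', 'Club Remix', 'Original Song']
```

Because each record's hash covers its parent's hash, the full lineage of a remix can be traced back to its genesis track, which is essentially what would let proceeds be divvied up along the chain.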
John Seely Brown, in a talk called Cultivating the Entrepreneurial Learner in the 21st Century, makes the case that we finally have the tools and technology to help spread and scale the kind of play-driven education first popularized by Maria Montessori more than 75 years ago. He also provides a useful metaphor to help understand how those same technology changes are shrinking the shelf life of key professional skills:
We are moving away from a 20th century notion of learning as picking up a set of fixed assets to a 21st century notion of learning as constantly reinventing and augmenting your skills. In the past, your skillset was authoritative, transferred to you in delivery models — often called schooling — and had a wonderful scalable efficiency. How do we move to a model that requires participating in ever-changing flows of activities and knowledge?
But much of the language we use to describe education and training remains firmly rooted in the asset-based approach Seely Brown describes: we talk about courses, and certificates, and degrees, all with the idea that learning is something to “complete”, with a defined endpoint. (Paulo Freire calls it the banking model of education.)
By way of comparison, it would be absurd to think it was possible to “complete” things like: being fit, eating right, or being a good marriage partner or parent. The only real measure of success on those dimensions is a sustained commitment to constant improvement — wanting to be at least slightly better today than you were yesterday.
It’s time to treat learning and skill development the same way — as a habit of deliberate practice to be cultivated and sustained for a lifetime — to keep learning in order to get a little bit better than you were yesterday.
On Building a Daily Habit of Continuous Learning
There is no such thing as Science. The word ‘Science’ refers to a reified generality that together with others, like Nature and Culture, has been a constant source of false problems: are controversies in Science decided by Nature or Culture? Avoiding badly posed problems requires that we replace Science with a population of individual scientific fields, each with its own concepts, statements, significant problems, taxonomic and explanatory schemas. There are, of course, interactions between fields, and exchanges of cognitive content between them, but that does not mean that they can be fused into a totality in which everything is inextricably related. There is not even a discernible convergence towards a grand synthesis to give us hope that even if the population of fields is highly heterogeneous today, it will one day converge into a unified field. On the contrary, the historical record shows a population progressively differentiating into many subfields, by specialization or hybridization, yielding an overall divergent movement.
Manuel DeLanda - Philosophical Chemistry: Genealogy of a Scientific Field - 2015
This is a very interesting article with implications for anyone interested in decision support.
New model describes cognitive decision making as the collapse of a quantum superstate
Decision making in an enormous range of tasks involves the accumulation of evidence in support of different hypotheses. One of the enduring models of evidence accumulation is the Markov random walk (MRW) theory, which assigns a probability to each hypothesis. In an MRW model of decision making, when deciding between two hypotheses, the cumulative evidence for and against each hypothesis reaches different levels at different times, moving particle-like from state to state and only occupying a single definite evidence level at any given point.
But the Markov random walk theory, based in classical probability theory, runs into problems when confronted with the emerging research consensus that preferences and beliefs are constructed, rather than revealed by judgments and decisions. An international group of psychological researchers now suggests a new model called the quantum random walk (QRW) theory that specifically posits that preferences and beliefs are constructed rather than revealed by judgments and decisions, and they have published the results of an experiment that support this theory in the Proceedings of the National Academy of Sciences.
By contrast with MRW, the new theory assumes that evidence develops over time in a superposition state analogous to the wave-like state of a photon, and judgements and decisions are made when this indefinite superposition state "collapses" into a definite state of evidence. It's important to note that the researchers are not suggesting that the brain is a quantum computer; they specifically note that their report uses quantum dynamics only metaphorically.
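To make the contrast concrete, here is a toy simulation of the classical Markov random walk side of the story - evidence occupying one definite level at each step until a threshold triggers the decision. The drift and threshold values are invented for illustration; the quantum model, by contrast, would keep all evidence levels in superposition until collapse:

```python
import random

def mrw_decision(drift=0.1, threshold=5.0, seed=0):
    """Toy Markov random walk: cumulative evidence occupies a single
    definite level at each moment, drifting toward the favored
    hypothesis until a decision threshold is crossed."""
    rng = random.Random(seed)
    evidence, steps = 0.0, 0
    while abs(evidence) < threshold:
        evidence += drift + rng.gauss(0, 1)  # signal plus noise
        steps += 1
    choice = "H1" if evidence >= threshold else "H2"
    return choice, steps

choice, steps = mrw_decision()
print(choice, steps)
```

Note that at any point you could pause this walk and read off a definite evidence level - exactly the particle-like property the QRW theory drops.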
Also related to cognition, this is a 92 min video by George Lakoff, who lays out the neural cognitive basis of metaphors - as the fundamental structure of language. Well worth the view.
George Lakoff: How Brains Think: The Embodiment Hypothesis
Keynote address recorded March 14, 2015 at the inaugural International Convention of Psychological Science in Amsterdam
On the topic of embodiment and the continued domestication of DNA - this is one of my favorite curiosities - the society that is an individual - the deeply social self.
The pronoun ‘I’ is becoming obsolete
Recent microbiological research has shown that thinking of plants and animals, including humans, as autonomous individuals is a serious over-simplification.
A series of groundbreaking studies has revealed that what we have always thought of as individuals are actually “biomolecular networks” that consist of visible hosts plus millions of invisible microbes that have a significant effect on how the host develops, the diseases it catches, how it behaves and possibly even its social interactions.
“It’s a case of the whole being greater than the sum of its parts,” said Seth Bordenstein, associate professor of biological sciences at Vanderbilt University, who has contributed to the body of scientific knowledge that is pointing to the conclusion that symbiotic microbes play a fundamental role in virtually all aspects of plant and animal biology, including the origin of new species.
In this case, the parts are the host and its genome plus the thousands of different species of bacteria living in or on the host, along with all their genomes, collectively known as the microbiome.
(The host is something like the tip of the iceberg while the bacteria are like the part of the iceberg that is underwater: Nine out of every 10 cells in plant and animal bodies are bacterial. But bacterial cells are so much smaller than host cells that they have generally gone unnoticed.)
Microbiologists have coined new terms for these collective entities – holobiont – and for their genomes – hologenome. “These terms are needed to define the assemblage of organisms that makes up the so-called individual,” said Bordenstein.
Designing for teamwork - as and when needed. An interesting 2min video of new forms of office equipment from Steelcase.
media:scape Overview
This video highlights the media:scape portfolio, including research and a demonstration of the product in use.
Here’s an interesting article discussing the consequences of technology for creating new forms of work. The graphs in the article are very compelling - well worth the view.
Technology has created more jobs than it has destroyed, says 140 years of data
Study of census results in England and Wales since 1871 finds rise of machines has been a job creator rather than making working humans obsolete
In the 1800s it was the Luddites smashing weaving machines. These days retail staff worry about automatic checkouts. Sooner or later taxi drivers will be fretting over self-driving cars.
The battle between man and machines goes back centuries. Are they taking our jobs? Or are they merely easing our workload?
A study by economists at the consultancy Deloitte seeks to shed new light on the relationship between jobs and the rise of technology by trawling through census data for England and Wales going back to 1871.
Their conclusion is unremittingly cheerful: rather than destroying jobs, technology has been a “great job-creating machine”. Findings by Deloitte such as a fourfold rise in bar staff since the 1950s or a surge in the number of hairdressers this century suggest to the authors that technology has increased spending power, therefore creating new demand and new jobs.
Their study, shortlisted for the Society of Business Economists’ Rybczynski prize, argues that the debate has been skewed towards the job-destroying effects of technological change, which are more easily observed than its creative aspects.
Going back over past jobs figures paints a more balanced picture, say authors Ian Stewart, Debapratim De and Alex Cole.
“The dominant trend is of contracting employment in agriculture and manufacturing being more than offset by rapid growth in the caring, creative, technology and business services sectors,” they write.
“Machines will take on more repetitive and laborious tasks, but seem no closer to eliminating the need for human labour than at any time in the last 150 years.”
This is an important and fascinating Nature article - for anyone interested in improving human trial research.
Registered clinical trials make positive findings vanish
A study showing a fall in positive trial results after the roll-out of clinicaltrials.gov attracted much attention on social media.
The launch of the clinicaltrials.gov registry in 2000 seems to have had a striking impact on reported trial results, according to a PLoS ONE study1 that many researchers have been talking about online in the past week.
A 1997 US law mandated the registry’s creation, requiring researchers from 2000 to record their trial methods and outcome measures before collecting data. The study found that in a sample of 55 large trials testing heart-disease treatments, 57% of those published before 2000 reported positive effects from the treatments. But that figure plunged to just 8% in studies that were conducted after 2000. Study author Veronica Irvin, a health scientist at Oregon State University in Corvallis, says this suggests that registering clinical studies is leading to more rigorous research. Writing on his NeuroLogica Blog, neurologist Steven Novella of Yale University in New Haven, Connecticut, called the study “encouraging” but also “a bit frightening” because it casts doubt on previous positive results.
Irvin and her co-author Robert Kaplan, chief science officer at the Agency for Healthcare Research and Quality in Rockville, Maryland, focused on human randomized controlled trials that were funded by the US National Heart, Lung, and Blood Institute (NHLBI). The authors conclude that registration of trials seemed to be the dominant driver of the drastic change in study results. They found no evidence that the trend could be explained by shifting levels of industry sponsorship or by changes in trial methodologies.
From a Nobel award in 2006 to application - another horizon in the domestication of DNA, without genetic modification of plants
The Next Great GMO Debate
Deep inside its labs, Monsanto is learning how to modify crops by spraying them with RNA rather than tinkering with their genes.
The Colorado potato beetle is a voracious eater. The insect can chew through 10 square centimeters of leaf a day, and left unchecked it will strip a plant bare. But the beetles I was looking at were doomed. The plant they were feeding on—bright green and carefully netted in Monsanto’s labs outside St. Louis—had been doused with a spray of RNA.
The experiment took advantage of a mechanism called RNA interference. It’s a way to temporarily turn off the activity of any gene. In this case, the gene being shut down was one vital to the insect’s survival. “I am pretty sure 99 percent of them will be dead soon,” said Jodi Beattie, a Monsanto scientist who showed me her experiment.
The discovery of RNA interference earned two academics a Nobel Prize in 2006 and set off a scramble to create drugs that block disease-causing genes. Using this same technology, Monsanto now thinks it has hit on an alternative to conventional genetically modified organisms, or GMOs. It can already kill bugs by getting them to eat leaves coated with specially designed RNA. And if the company succeeds in developing sprays that penetrate plant cells, as it’s attempting to, it could block certain plant genes, too. Imagine a spray that causes tomatoes to taste better or helps plants survive a drought.
Monsanto isn’t the only one working on genetic sprays. Other large agricultural biotech companies, including Bayer and Syngenta, are also investigating the technology. The appeal is that it offers control over genes without modifying a plant’s genome—that is, without creating a GMO.
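As a rough sketch of the mechanism, RNA interference works by base-pairing: a small interfering RNA complementary to part of a gene's messenger RNA marks that message for silencing. The toy below illustrates just the matching step, with invented sequences (a real pipeline would handle partial matches, the protein machinery and delivery):

```python
# Toy sketch of the RNA-interference idea: a small interfering RNA
# silences a gene when it base-pairs with part of the gene's mRNA.
# All sequences here are made up for illustration.

COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def reverse_complement(rna):
    """Return the strand that would base-pair with the given RNA."""
    return "".join(COMPLEMENT[base] for base in reversed(rna))

def is_silenced(mrna, sirna):
    """The gene is knocked down if the siRNA's pairing partner
    (its reverse complement) appears in the mRNA."""
    return reverse_complement(sirna) in mrna

beetle_mrna = "AUGGCUUACGGAUUCCGAAUG"   # fragment of a vital gene (invented)
sirna = reverse_complement("GGAUUCCGA")  # designed against that fragment
print(is_silenced(beetle_mrna, sirna))   # → True
```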
Here’s a very good 1 hr video about the interaction of biology and quantum mechanics - he explains everything very well, making both domains accessible. This is an important topic, as it hints at the very real possibility of a breakthrough paradigm shift in how we understand living systems.
Quantum Biology - Johnjoe McFadden
Professor Johnjoe McFadden's Presentation "Does Biology Need Quantum Mechanics?" at Imperial College London's Festival of Science 2014
Johnjoe McFadden obtained his PhD at Imperial College London, then went on to work on human genetic diseases and infectious diseases at the University of Surrey. He has more recently specialised in examining the genetics of microbes such as the agents of tuberculosis and meningitis. Professor McFadden has published more than 100 articles in scientific journals on subjects as wide-ranging as bacterial genetics, tuberculosis, idiopathic diseases and computer modelling of evolution.
As well as contributing articles on a broad range of scientific topics to the Guardian newspaper, Johnjoe is the author of the highly regarded book Quantum Evolution. In this book he proposes a new model for the fundamental mechanisms of evolution, while presenting quantum mechanics in such a way as to be accessible to those without a background in Physics or Chemistry.
By 2020, 5G networks should begin to roll out - the question is not whether we will need them (we will), but whether we are ready.
The Promise Of 5G
...we have moved through 3G and now 4G networks, with 5G on the horizon. 3G and 4G networks allowed us, as a society, to engage with the new world of digital information and entertainment on our own terms.
Now with the leap to 5G networks, we can start to completely reshape entire industries, and rethink how we run our cities and manage critical national infrastructures. 5G will be a far more capable network than its predecessors; it will deliver speeds of up to 10 Gigabits per second (Gbps), which is 40 times faster than the current maximum speeds achievable on 4G.
This sounds like futuristic wishful thinking, but, of course, so too did the idea of carrying around a mini supercomputer in your pocket. In fact, some companies and charitable organizations are already providing hints of what is possible.
Today, you access data on-demand as you move from point A to point B. For example, think about making a cross-country trip. You use your phone to check your flight status, download your boarding pass and confirm your hotel booking.
When you land, you use your phone to check your emails, send a quick text message home to let the family know you’ve arrived safely and summon a car from a mobile application to take you to a hotel you booked on your mobile phone via a different application. Today, when you need data service, you use your device to pull it down from the cloud via a high-speed network.
5G will turn this one-way interaction we have today with data into something new. Imagine a new network that will enable machines to communicate instantly without any human intervention, and to do things on our behalf and for our benefit without our active engagement.
The result will be a further transformation of how we live our lives, and a steep increase in machine-to-machine (M2M) communications to enable fuller, richer and more convenient lifestyles. This is the promise of 5G as it enables the Internet of Things (IoT).
From the relationship of quantum mechanics with biology to the role that the Internet of Things (IoT) will play. This is a short article about a long-term ecology project by Harvard that provides a wonderful view of the trajectory of the digital environment and Big Data.
Forget The Fridges. How Is IoT Saving Humanity?
Forget about Internet of Things’ poster child – the connected fridge. The importance of this emerging technology goes far beyond remembering to order milk.
From experimental forests to tracking elephant seals and smart beehives, scientists and technologists worldwide are running pioneering projects to examine climate change, save honeybees from extinction and even save humanity itself. IoT is at the forefront of many major projects taking place right now and is having a fundamental effect on progress.
The self-driving car is not here yet, but everyone has heard of its anticipated arrival (if you haven’t - then what island have you been stranded on?). This is a nice exploration of some of the implications.
IF AUTONOMOUS VEHICLES RULE THE WORLD
From horseless to driverless
Overturning industries and redefining urban life, self-driving cars promise to be as disruptive and transformative a technology as the mobile phone
SHORTLY after Thomas Müller eases his Audi A7 into the flow of highway traffic heading towards Shanghai, a message on the dashboard indicates that “piloted driving” is now available. Mr Müller, an Audi engineer, presses a button on the steering wheel and raises his hands. The car begins to drive itself, the steering wheel eerily moving on its own as the traffic creeps over a bridge towards the city centre.
This is, admittedly, a limited form of autonomy: the car stays on the same road, using cameras and a “LiDAR” scanner to follow the lane markings and maintain a constant distance from the vehicle in front. But this is how the world’s carmakers see the future of self-driving technology: as driver-assistance features that gradually trickle down from luxury vehicles to mass-market cars, just as electric windows and power steering did before them. Autonomous driving will, in this view, make motoring less stressful—drivers “arrive more relaxed”, says Mr Müller—but people will still buy and own cars just as they do today.
For a different vision of the driverless future, visit Heathrow airport outside London, and head to a “pod parking” area. Transfers between the car park and terminal are provided by driverless electric pods moving on dedicated elevated roadways. Using a touchscreen kiosk, you summon a pod and specify your destination. A pod, which can seat four people, pulls up, parks itself and opens its doors. Jump in, sit down and press the start button—the only control—and it drives you to your destination, avoiding other pods and neatly parking itself when you arrive, before heading off to pick up its next passengers.
Like riding in the autonomous Audi, travelling by pod is thrilling for the first 30 seconds—but quickly becomes mundane. The difference is that self-driving vehicles that can be summoned and dismissed at will could do more than make driving easier: they promise to overturn many industries and redefine urban life. The spread of driver-assistance technology will be gradual over the next few years, but then the emergence of fully autonomous vehicles could suddenly make existing cars look as outmoded as steam engines and landline telephones. What will the world look like if they become commonplace?
This is a recent Canadian imaging breakthrough that has lots of applications including enhancing night vision (and other difficult conditions for imaging).
ENERGY-EFFICIENT DEPTH-SENSING CAMERA GLEANS 3-D INFORMATION IN BRIGHT SUNLIGHT AND DARKNESS
CMU, Toronto Researchers Foresee Applications in Medicine, Games, Space Exploration
Depth-sensing cameras, such as Microsoft’s Kinect controller for video games, have become widely used 3-D sensors. Now, a new imaging technology invented by Carnegie Mellon University and the University of Toronto addresses a major shortcoming of these cameras: the inability to work in bright light, especially sunlight.
The key is to gather only the bits of light the camera actually needs. The researchers created a mathematical model to help program these devices so that the camera and its light source work together efficiently, eliminating extraneous light, or “noise,” that would otherwise wash out the signals needed to detect a scene’s contours.
“We have a way of choosing the light rays we want to capture and only those rays,” said Srinivasa Narasimhan, CMU associate professor of robotics. “We don’t need new image-processing algorithms and we don’t need extra processing to eliminate the noise, because we don’t collect the noise. This is all done by the sensor.”
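Here is a toy model, with invented numbers, of why gating the sensor to the light source's illumination window helps so much - ambient sunlight accumulates over the whole exposure, but the projected signal arrives only in a brief synchronized slice:

```python
# Toy model of the synchronized-capture idea (numbers invented): the
# projector illuminates any given part of the scene for only a small
# fraction of the exposure. A sensor open the whole time collects
# ambient light throughout; a sensor gated to the illuminated window
# collects the same signal but far less ambient light.

signal = 100.0          # photons from the projector, per exposure
ambient_rate = 50.0     # ambient (sunlight) photons per unit time
exposure = 100.0        # total exposure duration
gate = 2.0              # window when this part is actually illuminated

ungated_noise = ambient_rate * exposure      # 5000.0
gated_noise = ambient_rate * gate            # 100.0

print(signal / ungated_noise)  # → 0.02  (signal swamped by sunlight)
print(signal / gated_noise)    # → 1.0   (signal now comparable to noise)
```

As Narasimhan says, the point is that the noise is never collected in the first place, so no extra processing is needed to remove it.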
A key to a smart world of things is, of course, Artificial Intelligence. This next 20 min video is a great summary of the key problem in AI, as well as the best summary of the ‘Thinking Fast and Slow’ concept.
Monica Anderson: Dual Process Theory
Monica Anderson is CTO and co-founder of Sensai Corporation, founder of Syntience Inc., and originator of a theory for learning called "Artificial Intuition" that may allow us to create computer based systems that can understand the meaning of language in the form of text.
Here she discusses Dual Process Theory, The Frame Problem, and some consequences of these for AI research. Dual Process Theory is the idea that the human mind has two disparate modes of thinking - Subconscious Intuitive Understanding on one hand and Conscious Logical Reasoning on the other.
The Frame Problem is the idea that we cannot make comprehensive Models of the World because the world changes behind our backs and any Model we make is immediately obsolete. The conclusion is that AI research since the 1950s has been solving the wrong problem. She also introduces Model Free Methods as an alternative path to AI, capable of sidestepping the Frame Problem.
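A toy contrast may help - illustrative only, and far simpler than anything Anderson proposes. A model-based agent consults a stored snapshot of the world, which silently goes stale when the world changes behind its back (the Frame Problem); a model-free agent just reacts to what it currently observes:

```python
# Toy illustration of the Frame Problem (all details invented).
world = {"door": "open"}

model_based_map = dict(world)   # snapshot taken when the model was built

def model_based_action():
    # Consults the stored world model, not the world itself.
    return "walk through" if model_based_map["door"] == "open" else "turn handle"

def model_free_action(observation):
    # No stored model: reacts to the current observation directly.
    return "walk through" if observation["door"] == "open" else "turn handle"

world["door"] = "closed"        # the world changes behind our backs

print(model_based_action())     # → walk through  (stale model: wrong)
print(model_free_action(world)) # → turn handle   (reacts to reality)
```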
Here is a short, accessible 9-page paper on the same topic:
Reduction Considered Harmful
And here’s Monica’s website with more resources:
Here is another great article on the topic of AI - a long interview with Stephen Wolfram.
Interview with Stephen Wolfram on AI and the future
Few people in the tech world can truly be said to “need no introduction.” Stephen Wolfram is certainly one of them.
The following interview was conducted on June 27, 2015. Although it is lengthy, weighing in at over 10,000 words, it is published here in its entirety with only very minor edits for clarity.
Here’s a short Time article on the progress of solar energy.
This Is Where The World’s First Entirely Solar-Powered Airport Has Been Unveiled
Over the next 25 years, the new power system is expected to save 300,000 tons of carbon emissions, the equivalent of planting three million trees
Cochin International Airport in the southern Indian state of Kerala became the world’s first entirely solar-powered airport on Tuesday, unveiling a new system that will make the airport “absolutely power neutral,” according to a statement released by the parent company.
The airport’s solar power plant, which comprises more than 46,000 solar panels arrayed across 45 acres of land, will produce 48,000 units of energy per day, the Economic Times reports.
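A quick sanity check of the reported figures (assuming "units" means kilowatt-hours, the common usage in Indian electricity reporting):

```python
# Back-of-the-envelope check of the article's numbers.
panels = 46_000
units_per_day = 48_000          # kWh per day (assumed)
tons_co2_saved = 300_000        # over 25 years
trees_equivalent = 3_000_000

kwh_per_panel_per_day = units_per_day / panels
print(round(kwh_per_panel_per_day, 2))   # → 1.04, plausible for ~300 W panels

kg_co2_per_tree = tons_co2_saved * 1000 / trees_equivalent
print(kg_co2_per_tree)                   # → 100.0 kg per tree over 25 years
```

Both figures are in the ballpark of common solar-yield and tree-sequestration estimates, so the claims hang together.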
This is far from ready for prime time - but suggests some progress toward another form of energy.
A small, modular, efficient fusion plant
New design could finally help to bring the long-sought power source closer to reality.
It’s an old joke that many fusion scientists have grown tired of hearing: Practical nuclear fusion power plants are just 30 years away — and always will be.
But now, finally, the joke may no longer be true: Advances in magnet technology have enabled researchers at MIT to propose a new design for a practical compact tokamak fusion reactor — and it’s one that might be realized in as little as a decade, they say. The era of practical fusion power, which could offer a nearly inexhaustible energy resource, may be coming near.
Using these new commercially available superconductors, rare-earth barium copper oxide (REBCO) superconducting tapes, to produce high-magnetic field coils “just ripples through the whole design,” says Dennis Whyte, a professor of Nuclear Science and Engineering and director of MIT’s Plasma Science and Fusion Center. “It changes the whole thing.”
The stronger magnetic field makes it possible to produce the required magnetic confinement of the superhot plasma — that is, the working material of a fusion reaction — but in a much smaller device than those previously envisioned. The reduction in size, in turn, makes the whole system less expensive and faster to build, and also allows for some ingenious new features in the power plant design. The proposed reactor, using a tokamak (donut-shaped) geometry that is widely studied, is described in a paper in the journal Fusion Engineering and Design, co-authored by Whyte, PhD candidate Brandon Sorbom, and 11 others at MIT. The paper started as a design class taught by Whyte and became a student-led project after the class ended.
The world’s most powerful planned fusion reactor, a huge device called ITER that is under construction in France, is expected to cost around $40 billion. Sorbom and the MIT team estimate that the new design, about half the diameter of ITER (which was designed before the new superconductors became available), would produce about the same power at a fraction of the cost and in a shorter construction time.
But despite the difference in size and magnetic field strength, the proposed reactor, called ARC, is based on “exactly the same physics” as ITER, Whyte says. “We’re not extrapolating to some brand-new regime,” he adds.
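A rough scaling sketch - my own back-of-the-envelope, not from the article - shows why stronger magnets shrink the machine: at fixed plasma beta, fusion power density in a tokamak scales roughly as the fourth power of the magnetic field, so total power goes roughly as B^4 times volume:

```python
# Rough scaling sketch (assumptions: power ~ B^4 * R^3 at fixed
# aspect ratio and beta; ITER's on-axis field is about 5.3 T).
iter_field_T = 5.3
size_ratio = 0.5              # ARC is about half ITER's diameter

# Halving linear size cuts volume 8x, so matching ITER's power
# needs the field raised by a factor of 8^(1/4).
required_field_boost = (1 / size_ratio**3) ** 0.25
print(round(required_field_boost, 2))          # → 1.68

print(round(iter_field_T * required_field_boost, 1))
# → 8.9 T, roughly the field strength REBCO magnets are expected to allow
```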
Here’s an interesting interview discussing an innovative use of the ‘Blockchain’ in the music business. A hint of the many innovations that the blockchain can enable. This is part 1.
Imogen Heap Gets Specific About Mycelia: A Fair Trade Music Business Inspired By Blockchain
Imogen Heap is fed up with hearing artists, herself included, complain about the current state of the music industry. And when the composer, performer, technologist, inventor, and the only female artist to have ever won the Grammy for engineering decides to take action, things happen.
Ms. Heap is a galvanizing presence and a catalyst for change; the rare artist who is willing to channel her dissatisfaction into something tangible, and – given her popularity – able to have people pay attention. This spirit has resulted in, among other things: pioneering the practice of self-releasing music with 2005’s Speak for Yourself, long before doing so became somewhat of a norm; wrapping each song around a different project for her 2014 album, Sparks; and inventing a musical glove that she’s presented at TED and on other stages (and that you really must see to understand).
Via Zoe Keating, whom I interviewed on the subject of Bitcoin and the Arts, I was introduced to her good friend, Ms. Heap, and recently had the opportunity to speak with her – via Skype (Ms. Heap lives in England) and email.
After a 20-year career of wading through the dense fog of the music industry (along with every other artist on the planet), Heap’s crystal-clear belief – and mine as well – is that, given the unethical foundation upon which the industry was built and its many infamous shortcomings, nothing short of a wholesale reinvention will ever lead to real change.
The blockchain model is well elaborated and is a must-read. Part 2 is here:
Here’s the recent research report - highlighting the questionable moral grounds of the music industry’s treatment of artists.
BerkleeICE's Rethink Music Releases Report on Transparency and Fairness in the Music Industry
The Berklee Institute for Creative Entrepreneurship (BerkleeICE) today released an in-depth study focused on promoting fairness and transparency within the music industry.
Originating under BerkleeICE's Rethink Music initiative and entitled "Fair Music: Transparency and Money Flows in the Music Industry,” the report is the culmination of a year-long examination of the $45 billion global music business and explores the underlying challenges within the current compensation structure while proposing solutions to improve licensing, revenue transparency and cash flow for musicians.
The report was developed by Berklee College of Music faculty and students in collaboration with leading music industry organizations, companies, recording artists, and industry experts. It exposes current inefficiencies within the industry, including millions of dollars that go undistributed to rightful creators, backroom licensing deals that leave musicians out of the rights conversation entirely and overly opaque royalty statements and accounting systems that are often impossible to interpret or verify.
"As the music industry evolves and streaming services become the dominant means of listening, recording artists' and songwriters’ rights and the flow of money within the industry is the single biggest challenge today's musicians face, and with this initiative, we are addressing the issue head-on for today's creators, including Berklee students and alumni," said Allen Bargfrede, Berklee associate professor of music business, founder and executive director of Rethink Music and the project’s leader. "By highlighting recommendations—and not simply uncovering existing issues—our goal is to bring together industry stakeholders, technologists, academics, and others to push forward with crafting solutions in the near-term."
This is a very interesting 19-minute video discussing two things: a viral six-second music meme used everywhere, and how copyright law can be an impediment to the spread of ideas, using this music meme, known as the Amen Break, as an example.
Video explains the world's most important 6-sec drum loop
This fascinating, brilliant 20-minute video narrates the history of the "Amen Break," a six-second drum sample from the b-side of a chart-topping single from 1969. This sample was used extensively in early hiphop and sample-based music, and became the basis for drum-and-bass and jungle music -- a six-second clip that spawned several entire subcultures. Nate Harrison's 2004 video is a meditation on the ownership of culture, the nature of art and creativity, and the history of a remarkable music clip.
This is a longish piece by Steven Johnson - but a great complement to the other pieces in this line of thinking.
The Creative Apocalypse That Wasn’t
In the digital economy, it was supposed to be impossible to make money by making art. Instead, creative careers are thriving — but in complicated and unexpected ways.
On July 11, 2000, in one of the more unlikely moments in the history of the Senate Judiciary Committee, Senator Orrin Hatch handed the microphone to Metallica’s drummer, Lars Ulrich, to hear his thoughts on art in the age of digital reproduction. Ulrich’s primary concern was a new online service called Napster, which had debuted a little more than a year before. As Ulrich explained in his statement, the band began investigating Napster after unreleased versions of one of their songs began playing on radio stations around the country. They discovered that their entire catalog of music was available there for free.
Ulrich’s trip to Washington coincided with a lawsuit that Metallica had just filed against Napster — a suit that would ultimately play a role in the company’s bankruptcy filing. But in retrospect, we can also see Ulrich’s appearance as an intellectual milestone of sorts, in that he articulated a critique of the Internet-era creative economy that became increasingly commonplace over time. ‘‘We typically employ a record producer, recording engineers, programmers, assistants and, occasionally, other musicians,’’ Ulrich told the Senate committee. ‘‘We rent time for months at recording studios, which are owned by small-business men who have risked their own capital to buy, maintain and constantly upgrade very expensive equipment and facilities. Our record releases are supported by hundreds of record companies’ employees and provide programming for numerous radio and television stations. ... It’s clear, then, that if music is free for downloading, the music industry is not viable. All the jobs I just talked about will be lost, and the diverse voices of the artists will disappear.’’
But starting with Ulrich’s testimony, a new complaint has taken center stage, one that flips those older objections on their heads. The problem with the culture industry is no longer its rapacious pursuit of consumer dollars. The problem with the culture industry is that it’s not profitable enough. Thanks to its legal troubles, Napster itself ended up being much less important as a business than as an omen, a preview of coming destructions. Its short, troubled life signaled a fundamental rearrangement in the way we discover, consume and (most importantly) pay for creative work. In the 15 years since, many artists and commentators have come to believe that Ulrich’s promised apocalypse is now upon us — that the digital economy, in which information not only wants to be free but for all practical purposes is free, ultimately means that ‘‘the diverse voices of the artists will disappear,’’ because musicians and writers and filmmakers can no longer make a living.
The dystopian scenario, after all, isn’t about the death of the record business or Hollywood; it’s about the death of music or movies. As a society, what we most want to ensure is that the artists can prosper — not the record labels or studios or publishing conglomerates, but the writers, musicians, directors and actors themselves.
Their financial fate turns out to be much harder to measure, but I endeavored to try. Taking 1999 as my starting point — the year both Napster and Google took off — I plumbed as many data sources as I could to answer this one question: How is today’s creative class faring compared with its predecessor a decade and a half ago? The answer isn’t simple, and the data provides ammunition for conflicting points of view. It turns out that Ulrich was incontrovertibly correct on one point: Napster did pose a grave threat to the economic value that consumers placed on recorded music. And yet the creative apocalypse he warned of has failed to arrive. Writers, performers, directors and even musicians report their economic fortunes to be similar to those of their counterparts 15 years ago, and in many cases they have improved. Against all odds, the voices of the artists seem to be louder than ever.
Here is an interesting article looking at how the use of expletives varies by geography. A fascinating use of social media. This is well worth the read.
The Geography of Profanity
Like other things, our preferred swear words are regional.
“Asshole is a wonderful word,” said Mike Pesca in his podcast, the Gist. His former colleagues at NPR had wanted to call someone an asshole, and even though it was for a podcast, not broadcast, and even though the person in question was a certified asshole, the NPR censor said no. Pesca disagreed.
Pesca is from Long Island and, except for his college years in Atlanta, he has spent most of his time in the Northeast. Had he hailed from Atlanta—or Denver or Houston or even San Francisco—“asshole” might not have sprung so readily to his mind as le mot juste, even to denote Donald Trump. The choice of swear words is regional.
Linguist Jack Grieve has been analyzing tweets—billions of words—and recently he posted maps showing the relative popularity of different expletives. For example, every county in the Northeast tweets “asshole” at a rate at least two standard deviations above the national mean.
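Grieve’s criterion above (“at least two standard deviations above the national mean”) is just a z-score computed over per-county usage rates. As a rough illustration of that calculation, here is a minimal Python sketch; every county name and rate below is invented for the example, and this is not Grieve’s actual data or pipeline.

```python
# A minimal sketch of the analysis described above: expressing each county's
# usage rate of a word as a z-score, i.e. its distance from the national
# mean measured in standard deviations. All rates here are made up.
from statistics import mean, pstdev

# hypothetical occurrences per million tweets, per county
rates = {
    "Suffolk, NY": 300.0,
    "Kings, NY": 280.0,
    "Fulton, GA": 90.0,
    "Denver, CO": 85.0,
    "Harris, TX": 95.0,
    "San Francisco, CA": 120.0,
    "Maricopa, AZ": 80.0,
    "Cook, IL": 110.0,
}

mu = mean(rates.values())       # national mean rate
sigma = pstdev(rates.values())  # population standard deviation

z_scores = {county: (rate - mu) / sigma for county, rate in rates.items()}

# Counties at or above z = 2 are the ones a map like Grieve's would shade
# most darkly; here we just list every county from highest to lowest.
for county, z in sorted(z_scores.items(), key=lambda kv: -kv[1]):
    print(f"{county}: z = {z:+.2f}")
```

With real data, the same z ≥ 2 threshold would pick out the counties Grieve’s maps highlight; in this toy sample the two Northeast counties simply come out on top.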
The original research results can be found here.
Research Blog
Here’s something interesting - the interaction of traditional board games with the Internet - The Virtual funding the Face-to-Face.
KICKSTARTERS OF CATAN
Crowdfunding Is Driving A $196 Million Board Game Renaissance
Albert Mach wants to help you lead a Viking clan. He wants you to compete for honor and treasure and the control of islands. He wants you to tame the wild dragon. And he wants the masses of the Internet to bankroll all of it.
That’s because he’s a board game designer. When I talked to Mach, he and his two brothers were about two weeks into a monthlong fundraising campaign to launch their first game, Vikings of Dragonia. Mach had never made a serious go at creating a game before but figured why not — we’re living in a golden age of board games, after all. The Settlers of Catan, first published in Germany in 1995, introduced many Americans to so-called Euro-style board games — games with elegant gameplay, deep strategy, compelling themes and attractive art. Since then, the quantity, quality and variety of new board and card games seem to increase every year. Attendance at gaming conventions has boomed. One of the biggest, Gen Con, set its sixth straight attendance record this year. Sales of games are swelling, too: The hobby game market had an estimated $880 million in sales in 2014 in the U.S. and Canada, up 20 percent from the year before, according to the pop-culture resource ICv2.
And now there are more games being made than ever. The crowdfunding website Kickstarter has become the go-to place to finance a passion board game project. “The barrier to entry is much lower, especially with board games,” Mach said. “All you need is a pencil and paper.”
Not from the domestication of DNA - but very happy news anyway.
Extinct tree grows anew from ancient jar of seeds unearthed by archaeologists
For thousands of years, Judean date palm trees were one of the most recognizable and welcome sights for people living in the Middle East -- widely cultivated throughout the region for their sweet fruit, and for the cool shade they offered from the blazing desert sun.
From its founding some 3,000 years ago, to the dawn of the Common Era, the trees became a staple crop in the Kingdom of Judea, even garnering several shout-outs in the Old Testament. Judean palm trees would come to serve as one of the kingdom's chief symbols of good fortune; King David named his daughter, Tamar, after the plant's name in Hebrew.
Sadly, around the year 500 AD, the once plentiful palm had been completely wiped out, driven to extinction for the sake of conquest.
During excavations at the site of Herod the Great's palace in Israel in the early 1960s, archeologists unearthed a small stockpile of seeds stowed in a clay jar dating back 2,000 years. ...in 2005, botanical researcher Elaine Solowey decided to plant one and see what, if anything, would sprout.
Amazingly, the multi-millennial seed did indeed sprout -- producing a sapling no one had seen in centuries, becoming the oldest known tree seed to germinate. Today, the living archeological treasure continues to grow and thrive. In 2011, it even produced its first flower -- a heartening sign that the ancient survivor was eager to reproduce. Meanwhile, Solowey is working to revive other age-old trees from their long dormancy.