… they were now their own unavoidable experiment, and were making themselves into many things they had never been before: augmented, multi-sexed, and most importantly, very long-lived, the oldest at that point being around two hundred years old. But not one whit wiser, or even more intelligent. Sad but true: individual intelligence probably peaked in the Upper Paleolithic, and we have been self-domesticated creatures ever since, dogs when we had been wolves.
2312, Kim Stanley Robinson
[Artificial Intelligence] may well be the most vital of all commodities, surpassing water, food, heat and light. Without it, we will certainly not survive as a species.

One of our problems is data – masses of it. A few hundred years of scientific inquiry and the invention of the data-generating and sharing mechanism that is the internet have left reams of crucial information unused and unanalysed.

AI is not about sentient robots, but machines that mimic our organic intelligence by adapting to, as well as recognising, patterns in data. AI is about making machines understand.
Jamie Carter / Peter Cochrane, { South China Morning Post }

…I’m especially glad to hear that you like the exam format (try until you get it).

Working at Udacity is really exciting. We get to try to rethink education every day. Often it’s easy to get stuck in our preconceived ideas of what education ought to look like, and these ideas are hard to approach objectively. One of the Udacity decisions that I’m most excited about is the way we do our exams.

Education shouldn’t be a sieve. The goal isn’t to separate students into “those who get it” and “those who don’t” groups, yet this notion seems to be the commonly held one as far as assessment is concerned. What’s the point in that!? Education should be empowering. Instructors should be ashamed when they have students in the “those who don’t get it” group.

I think the fundamental problem is that people who become instructors were often in the “get it” group, and they like to attribute that to something special about themselves. It makes them feel smart. They want to believe that there really are smart people and not-smart people in the world and that they are the smart ones.

But that is complete nonsense. I’ve done a lot of tutoring, and I’ve never met anyone who actually couldn’t understand physics. I’ve just met people who haven’t been taught in a way that makes sense to them. Everyone is smart.

When you really believe that everyone is smart, assessment is no longer about identifying who has ability. It’s about confirming that everyone does. If you can’t ace the exam, it doesn’t mean you have failed. It means you haven’t yet fully succeeded. This, to me, is an infinitely better approach to learning.

Andy Brown, Instructor of Udacity’s first physics course, PH100.

{ Original Context }

••••••

I hope Andy won’t mind that I’m blogging this, as it’s amazing. Reading that just made me really, really happy.

I wish this mindset were more prevalent; methodology like this is why I love Udacity.

caemron asked:

Hello!! Love your blog. I'm really interested in physics; I'm 15, and I've been following Khan Academy tutorials and soon hope to follow "Physics 1" on the MIT website… I know this'll sound weird, but my site, I suppose, is "the other end of the nerd": I basically write poetry about how hopeless my personal life is. You could look at it as a "nerds have feelings too" site… hahaha. Anyway, if there's any chance you could give me a shout out or something, it would be very much appreciated. :)

Thank you.

That’s awesome! It’s good you’re starting now; seems like you’re on the right track.

As for your poetry blog, “nerds” are very creative people. You have to be, once you get past the rote memorization stuff and into problem-solving, theory, and exploration. And of course [we] have feelings! […even if many of us tend to think of those as an epiphenomenon of a physical system.]

I hope you’ll do well, and best of luck with the personal things – it’s good to have an outlet. If you do the work and remember to keep a balance of honesty and logic, it’ll get better after 15.

“Nell,” the Constable continued, indicating through his tone of voice that the lesson was concluding, “the difference between ignorant and educated people is that the latter know more facts. But that has nothing to do with whether they are stupid or intelligent. The difference between stupid and intelligent people—and this is true whether or not they are well-educated—is that intelligent people can handle subtlety. They are not baffled by ambiguous or even contradictory situations—in fact, they expect them and are apt to become suspicious when things seem overly straightforward.”

Nell did not imagine that Constable Moore wanted to get into a detailed discussion of recent events, so she changed the subject. “I think I have finally worked out what you were trying to tell me, years ago, about being intelligent,” she said.

The Constable brightened all at once. “Pleased to hear it.”

“The Vickys have an elaborate code of morals and conduct. It grew out of the moral squalor of an earlier generation, just as the original Victorians were preceded by the Georgians and the Regency. The old guard believe in that code because they came to it the hard way. They raise their children to believe in that code – but their children believe it for entirely different reasons.”

“They believe it,” the Constable said, “because they have been indoctrinated to believe it.”

“Yes. Some of them never challenge it – they grow up to be small-minded people, who can tell you what they believe but not why they believe it. Others become disillusioned by the hypocrisy of the society and rebel – as did Elizabeth Finkle-McGraw.”

“Which path do you intend to take, Nell?” said the Constable, sounding very interested. “Conformity or rebellion?”

“Neither one. Both ways are simple-minded – they are only for people who cannot cope with contradiction and ambiguity.”

Neal Stephenson
The Diamond Age: Or, a Young Lady’s Illustrated Primer

IBM unveils cognitive computing chips, combining digital ‘neurons’ and ‘synapses’
August 18, 2011, { Kurzweil AI }

IBM researchers unveiled today a new generation of experimental computer chips designed to emulate the brain’s abilities for perception, action and cognition.

In a sharp departure from traditional von Neumann computing concepts in designing and building computers, IBM’s first neurosynaptic computing chips recreate the phenomena between spiking neurons and synapses in biological systems, such as the brain, through advanced algorithms and silicon circuitry.

The technology could yield many orders of magnitude less power consumption and space than used in today’s computers, the researchers say. Its first two prototype chips have already been fabricated and are currently undergoing testing.

Called cognitive computers, systems built with these chips won’t be programmed the same way traditional computers are today. Rather, cognitive computers are expected to learn through experiences, find correlations, create hypotheses, and remember — and learn from — the outcomes, mimicking the brain’s structural and synaptic plasticity.

“This is a major initiative to move beyond the von Neumann paradigm that has been ruling computer architecture for more than half a century,” said Dharmendra Modha, project leader for IBM Research.
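
The article doesn’t describe IBM’s circuitry, but the spiking-neuron-and-synapse abstraction it refers to is easy to illustrate in software. Here’s a minimal leaky integrate-and-fire sketch in Python; the threshold, leak, and synapse weight are illustrative assumptions, not anything from IBM’s design:

```python
# Minimal leaky integrate-and-fire (LIF) neuron: the membrane potential
# accumulates weighted input spikes, "leaks" toward zero each step, and
# the neuron fires when the potential crosses a threshold.
import random

class LIFNeuron:
    def __init__(self, threshold=1.0, leak=0.9):
        self.potential = 0.0
        self.threshold = threshold  # firing threshold (illustrative)
        self.leak = leak            # per-step decay factor (illustrative)

    def step(self, weighted_input):
        """Integrate one time step; return 1 on a spike, else 0."""
        self.potential = self.potential * self.leak + weighted_input
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after firing
            return 1
        return 0

# Drive the neuron through a single synapse with random input spikes.
synapse_weight = 0.3
neuron = LIFNeuron()
for t in range(20):
    incoming = random.random() < 0.5   # Bernoulli input spike train
    spike = neuron.step(synapse_weight * (1 if incoming else 0))
    print(f"t={t:2d}  in={int(incoming)}  V={neuron.potential:.2f}  out={spike}")
```

The point of the abstraction: computation happens as discrete events (spikes) propagating through weighted synapses rather than as a clocked instruction stream, which is the departure from the von Neumann pattern the article describes.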

We have known for a long time that neuronal circuits become established and get reinforced via experience – it’s a phenomenon known as “synaptic plasticity.” For example, this is how memories become anchored in the brain. The team working on the Blue Brain Project at EPFL, led by Professor Henry Markram, however, is offering radically new evidence that this may not be the whole story. The researchers were able to demonstrate that small clusters of pyramidal neurons in the neocortex interconnect according to a set of immutable and relatively simple rules.

These clusters contain an estimated fifty neurons, on average. The scientists look at them as essential building blocks, which contain in themselves a kind of fundamental, innate knowledge – for example, representations of certain simple workings of the physical world. Acquired knowledge, such as memory, would involve combining these elementary building blocks at a higher level of the system. “This could explain why we all share similar perceptions of physical reality, while our memories reflect our individual experience”, explains Markram.

“Since John Locke, about 400 years ago, research into how the brain learns and remembers has been guided by the belief that we start from a clean slate and then print memories with each new experience. The idea that memory is like building with Lego, with fundamental building blocks of knowledge, opens up an entirely new door for research”, explains Markram.

Current technology is now allowing us to qualify the “tabula rasa” hypothesis, which argues that our brains are a “blank slate” at birth, and we only gain knowledge through experience. It’s an idea that has permeated science for centuries. There is no question that knowledge, in the sense that we typically understand it (reading and writing, recognizing our friends, learning a language), is the result of our experiences. But the EPFL team’s work demonstrates that some of our fundamental representations or basic knowledge is inscribed in our genes. This discovery redistributes the balance between innate and acquired, and represents a considerable advance in our understanding of how the brain works.
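
The EPFL result is about innate wiring; the baseline it qualifies – experience-driven synaptic plasticity – is often formalized as a Hebbian rule (“cells that fire together wire together”). A toy sketch, with the learning rate and decay chosen purely for illustration:

```python
# Toy Hebbian plasticity ("cells that fire together wire together"):
# a synaptic weight strengthens when pre- and postsynaptic activity
# coincide and slowly decays otherwise. All rates are illustrative.
import random

def hebbian_update(weight, pre_active, post_active,
                   learning_rate=0.1, decay=0.01):
    """Return the updated synaptic weight, kept in [0, 1]."""
    if pre_active and post_active:
        weight += learning_rate * (1.0 - weight)  # saturate toward 1
    else:
        weight -= decay * weight                  # slow forgetting
    return weight

w = 0.2
for _ in range(50):
    pre = random.random() < 0.6    # presynaptic activity this step
    post = random.random() < 0.6   # postsynaptic activity this step
    w = hebbian_update(w, pre, post)
print(f"weight after 50 steps: {w:.3f}")
```

In Markram’s picture, updates like this would operate on top of pre-wired neuronal clusters – combining innate building blocks – rather than writing onto a blank slate.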


{ Collaborative Learning for the Digital Age }
By Cathy N. Davidson
August 26, 2011

We used a method that I call “collaboration by difference.” Collaboration by difference is an antidote to attention blindness. It signifies that the complex and interconnected problems of our time cannot be solved by anyone alone, and that those who think they can act in an entirely focused, solitary fashion are undoubtedly missing the main point that is right there in front of them, { thumping its chest and staring them in the face. } Collaboration by difference respects and rewards different forms and levels of expertise, perspective, culture, age, ability, and insight, treating difference not as a deficit but as a point of distinction. It always seems more cumbersome in the short run to seek out divergent and even quirky opinions, but it turns out to be efficient in the end and necessary for success if one seeks an outcome that is unexpected and sustainable. That’s what I was aiming for.

See also:

Jonah Lehrer on { The Power of Outsider Intelligence }

& A new study showing faster synchronization in { disordered networks }


Synchronisation occurs when individual elements in a complex network behave in line with each other. This applies to real-life examples such as the way neurons fire during an epileptic seizure or the phenomenon of crickets falling into step with one another.

… researchers found that the higher the disorder in the network, the faster the synchronization. They subsequently verified this observation in real-life networks, including an air transportation network, a social network, and a human travel network.


This result goes against previous observations, which showed that so-called small-world networks, which consist of an intermediate structure of fully ordered and fully disordered networks, favour synchronisation.

Disordered networks synchronize faster than small-world networks
A study recently published in { European Physical Journal B }

{ Speed of complex network synchronization }
C. Grabow, S. Grosskinsky and M. Timme
Eur. Phys. J. B (2011) DOI: 10.1140/epjb/e2011-20038-9

1 { Kurzweil AI }
2 { European Physical Journal }
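
The excerpt doesn’t reproduce the paper’s model, but the standard testbed for questions like this is the Kuramoto model of coupled phase oscillators. Here is a sketch comparing a Watts-Strogatz small-world network with a fully disordered random network of the same size and edge count; the coupling strength, frequency spread, and the r > 0.9 criterion are illustrative assumptions and may differ from the paper’s actual setup:

```python
# Kuramoto oscillators on two topologies: a Watts-Strogatz small-world
# network and a fully disordered (Erdos-Renyi type) network with the
# same number of edges. The order parameter r in [0, 1] measures global
# synchrony; how fast r rises compares synchronization speed.
import numpy as np
import networkx as nx

def kuramoto_order(G, K=2.0, dt=0.01, steps=2000, seed=0):
    rng = np.random.default_rng(seed)
    A = nx.to_numpy_array(G)
    deg = np.maximum(A.sum(axis=1), 1)       # avoid division by zero
    n = G.number_of_nodes()
    omega = rng.normal(0.0, 0.5, n)          # natural frequencies
    theta = rng.uniform(0.0, 2 * np.pi, n)   # initial phases
    r_trace = []
    for _ in range(steps):
        # coupling term for node i: sum_j A_ij * sin(theta_j - theta_i)
        coupling = (A * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
        theta += dt * (omega + K * coupling / deg)
        r_trace.append(abs(np.exp(1j * theta).mean()))  # order parameter
    return r_trace

n = 100
small_world = nx.watts_strogatz_graph(n, k=6, p=0.1, seed=1)
disordered = nx.gnm_random_graph(n, small_world.number_of_edges(), seed=1)

for name, G in [("small-world", small_world), ("disordered", disordered)]:
    r = kuramoto_order(G, seed=2)
    # first step at which r exceeds 0.9, or None within this run
    t90 = next((t for t, v in enumerate(r) if v > 0.9), None)
    print(f"{name:12s}  final r={r[-1]:.2f}  steps to r>0.9: {t90}")
```

The order parameter r runs from 0 (incoherent phases) to 1 (full synchrony), so comparing how quickly each topology drives r upward is one concrete reading of “synchronizes faster”.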

••••••

Can this be – or has it been – explored further in terms of Brownian motion and the evolution of the universe, biological and otherwise? Anthropology?

Also curious, since synchronization is a type of ordering… it seems intuitive that a pre-ordered (pre-connected) network would have fewer “openings” left for the new connections that lead to synchronization than a disordered one…

Could it be related to polymathy, creativity, inspiration &
{ "the power of outsider intelligence" } ?

Isn’t synchronization sought after in the particular (as in particles, not as in idiosyncratic) formation of life?

{ Conway’s Game of Life }
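
And since this closes on Conway’s Game of Life, a self-contained sketch of it – global structure (gliders, oscillators) emerging from purely local rules, which is the flavor of ordering the questions above are circling:

```python
# Conway's Game of Life on a small toroidal grid: each cell lives or
# dies by counting its eight neighbours. A glider is seeded so the
# first few generations show a coherent structure moving diagonally.
def step(grid):
    rows, cols = len(grid), len(grid[0])
    nxt = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # count live neighbours with wrap-around (toroidal) edges
            n = sum(grid[(r + dr) % rows][(c + dc) % cols]
                    for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                    if (dr, dc) != (0, 0))
            # birth on exactly 3 neighbours; survival on 2 or 3
            nxt[r][c] = 1 if n == 3 or (grid[r][c] and n == 2) else 0
    return nxt

# Seed a glider: it translates one cell diagonally every four generations.
grid = [[0] * 8 for _ in range(8)]
for r, c in [(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)]:
    grid[r][c] = 1

for gen in range(4):
    print(f"generation {gen}:")
    print("\n".join("".join("#" if x else "." for x in row) for row in grid))
    grid = step(grid)
```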