Professor Irene Pepperberg studies the behaviour of grey parrots.
At first, this seems a little dull. After all, parrots just, well, parrot! ...Don't they? Well, maybe not. One grey parrot she is studying, called "Alex", is capable of handling basic mathematical operations from 1 to 8 (hmmm! Octal!) and can understand or vocalize up to 250 words.
Unlike most of those phony demonstrations of dogs counting, etc., this is done under laboratory conditions and involves some measure of deductive reasoning. Simple repetition, on a cue, would fail these kinds of tests.
The teaching method is fascinating. It involves peer-to-peer, rather than teacher-student, relationships, and is NOT reward-based. There is no special merit for doing well, and no special punishment for doing badly. As such, the training method is surprisingly similar to approaches described in many papers over on GNU.org.
What does this mean? After all, we're not parrots. Our brains aren't even patterned after the avian brain structure.
The first thing it means is that we can stop talking about "our dumb animal friends". This patronizing and idiotic attitude has likely been a major reason basic studies on the intelligence of animals are as primitive and unsuccessful as they have been.
The second is that virtually all animal behavioural research may be barking up the wrong tree. Research that assumes all responses are instinctual will miss the Alexes of the rest of the animal kingdom.
On a related note, attempts to use sign language and touch-screens with apes and dolphins, whilst producing some information, may be missing the boat entirely, because they're all lecture-student based, not peer-to-peer.
This brings me to the next point. School, for us humans, is almost invariably lecture-student based, but if this practice is proving to be inefficient or simply stupid for other animals, maybe the educational system itself needs a major shake-up.
(Indeed, it's problems such as imposing a fixed rate of learning on all children that cause much of the violence and disruption -in- schools. The "slower" learners are likely to feel "put down", and the "faster" learners are going to be bored, frustrated, and probably drugged by the school to maintain control.)
A peer-to-peer system may allow more "natural" learning, which operates in a way that works with the brain, rather than against it. And this, I feel, would seriously reduce the stresses, strains and intense dislikes within the existing hodge-podge of dictatorial systems.
In the end, maybe Alex will prove smarter than us humans.
Lastly, machine intelligence. Again, we're trying to teach AI systems on a lecture-student basis. Again, Alex (and similar studies) demonstrate that this form of education doesn't work.
The obvious conclusion is that "Strong AI" will never be developed until we change our view on what intelligence is.
Furthermore, because Alex's brain is hardly the most complex device in existence, there are good reasons to believe that the Strong AI problem may be easier to solve than commonly thought. The brain-power required for deductive, logical and lateral thinking, in real-time, may prove to be extremely small.