Students for a Smarter Planet … leaders with conscience
Computing
February 27th, 2015, 5:43
 

Decades ago, a now-classic episode of the TV program “I Love Lucy” had a hilarious sketch involving Lucy and her best friend, Ethel, hand-rolling and packaging chocolates on a conveyor belt.  This particular show is still among my favorites – not least because it involved chocolate, of which I am passionately fond!

There have been a lot of advances in assembly-line technology since then.  Here’s a look at:

The rarely-seen robots that package what we eat

Delta robots: the robots actually in factories don’t have hands…they’re more unusual (SPL)

And a very interesting look at the man who ‘started it all’:  Reymond Clavel of EPFL (École polytechnique fédérale de Lausanne)

 

Robots are handling many of the tasks that used to require human dexterity.  Are you part of the advance of the machines?  (Although they probably won’t make you laugh till you cry like Lucy and Ethel did!)

February 13th, 2015, 5:16
 

As we become more globally connected every day, the need for clear communication grows by leaps and bounds.  Dialects, accents, slang, and just plain old grammar rules can get in the way of comprehension – and missing the point of a communiqué can lead to dire consequences!

At IBM, Natural Language Processing (NLP) is one of the big areas of tech getting a lot of focus.  And there’s a push to ‘internationalize’ the Watson system for global usage.  Read about one application here:

IBM Watson: Watson’s Learning Japanese

 “The fact that humans and Watson are good at different things makes them good partners. The more languages Watson understands, the greater its utility will be in various endeavors and disciplines.”
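Why is teaching Watson a new language such a big engineering lift? One reason: unlike English, Japanese is written without spaces between words, so even the very first step of NLP – splitting a sentence into words – needs a dictionary and an algorithm rather than a simple split on whitespace. Here’s a minimal, purely illustrative Python sketch of that idea (a greedy longest-match segmenter over a tiny made-up lexicon – real systems like Watson rely on far more sophisticated statistical models):

# Toy word segmenter for unspaced text such as Japanese.
# Purely illustrative: the lexicon is invented, and real NLP
# systems use statistical models, not greedy dictionary lookup.
LEXICON = {"私", "は", "日本語", "を", "勉強", "し", "ます"}
MAX_WORD_LEN = max(len(word) for word in LEXICON)

def segment(text):
    """Greedily match the longest known word at each position."""
    words, i = [], 0
    while i < len(text):
        for length in range(min(MAX_WORD_LEN, len(text) - i), 0, -1):
            candidate = text[i:i + length]
            if candidate in LEXICON or length == 1:  # 1-char fallback
                words.append(candidate)
                i += length
                break
    return words

print(segment("私は日本語を勉強します"))
# -> ['私', 'は', '日本語', 'を', '勉強', 'し', 'ます']

Even this toy version shows why “internationalizing” a system is real work: the lexicon, the matching rules, and the fallbacks all have to be rebuilt for each new language.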

 

Don’t think this is an area that matters?  This article from the Huffington Post gives a lot of information about why and how language is central to our moving forward in all phases of business and education:

Technology to the Endangered Language Rescue?

 

Whether you spend more of your time communicating by keyboard or phone, it’s essential to be clear in the message you provide to your audience.  Language is at the core of human interaction – written or verbal; what do you have to say?

February 11th, 2015, 5:34
 

There are few who live in the developed world who don’t have a list as long as their arm of the many passwords they use at work and home to conduct their daily business and routines.  That may all change – sooner than you think…  A groundswell of support is underway for BIOMETRIC PASSWORDS!  (You can learn more about biometrics by clicking the link.)

Biometrics (photo credit: images.techhive.com)

Here are a few articles that have recently hit the news, in which you’ll find supporting arguments and possible pitfalls of adopting this type of system to identify the user of a particular app or piece of equipment:

From the US Military: DARPA wants behavior-based biometrics to replace passwords

From the UK:  Young people ‘ready to ditch passwords for biometric security’

From PCWorld Magazine: Intel and McAfee plan to kill PC passwords with new biometric authentication

And an alternate view from ComputerWorld: Will we embrace biometrics to replace passwords?

One interesting comment from the ComputerWorld opinion piece: “Privacy policies constantly change . . . and usually the user loses out. Whoever stores our biometrics had better secure it well as surely the NSA would love to hoover it all up and store it for eternity.”
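That storage worry points at a real technical difference. A password can be hashed, so the secret itself never has to be stored – but two scans of the same finger or iris are never bit-for-bit identical, so biometric systems must compare a fresh sample against a stored template using a similarity threshold. Here’s a toy Python sketch of the contrast (the 16-bit “templates” and the threshold are invented for illustration – real systems extract far richer features):

import hashlib

# Passwords: hash and compare exactly; the secret need not be stored.
stored_hash = hashlib.sha256(b"correct horse battery staple").hexdigest()
attempt_hash = hashlib.sha256(b"correct horse battery staple").hexdigest()
print("password accepted:", attempt_hash == stored_hash)

# Biometrics: a fresh scan differs slightly from enrollment,
# so exact hashing fails and a stored template must be kept.
enrolled_template = 0b1011001110001011  # invented 16-bit template
fresh_scan        = 0b1011001010001111  # same finger, two bits flipped

def hamming_distance(a, b):
    """Number of bits on which two templates disagree."""
    return bin(a ^ b).count("1")

# Accept if the scans are "close enough" (threshold is illustrative).
print("biometric accepted:", hamming_distance(enrolled_template, fresh_scan) <= 3)

That stored template is exactly what the commenter is worried about – it can’t simply be hashed away.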

Data security is changing daily; maybe any future aliens who land on our planet will no longer be seeking the person in charge or leader, but instead will be stating, “Take me to your READER”.  Of course, their retinal scan would have to be pre-programmed into the system to gain access…hmmmmm, sci-fi, anyone??

February 7th, 2015, 5:10
 

Scientists at Vanderbilt University’s Visual Cognitive Neuroscience Laboratory are studying whether electrical brain stimulation can enhance thought processes.  The technology, though still being tested for adverse health effects, is also being considered for use in treating brain disorders.

Read the blurb and watch the video:

Could a “thinking cap” help us learn? – Science Nation

 

Neurological science is a growing field as we explore more and more applications dealing with cognition and computing as a combined area.  Put on your own “thinking cap” and you may come up with the next big invention!

 


By Basant Pandey, Lumbini, Nepal

Alfred Spector: technology can enhance learning (Amy Sussman)

 

“This discovery of yours will create forgetfulness in the learners’ souls, because they will not use their memories,” a concerned commentator once spoke of a new technology. “[People] will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.”

The commentator? Socrates, quoting an Egyptian king. And the technology? Writing.

Two thousand years later, the technology has changed but the dialogue remains the same. Facebook, smartphones, and video games are all supposedly bad for us: damaging our concentration, or leading to falling grades.

While there’s no doubt that information technology can have its downsides for our day-to-day behaviour, there is very little evidence that computers are damaging our brains – any more than writing made us more forgetful. In fact, computers might just make us a bit smarter.

This potential for technology to enhance the mind was explored by Google’s vice-president of research Alfred Spector at the World-Changing Ideas Summit in New York on 21 October. He outlined the ways that even simple apps could improve the way we think and learn. “Since I was a freshman in college, almost every piece of information technology is a million times better than when I started,” he said. “And there are reasons to believe that this will affect education.”

For some enthusiastic learners, the revolution has already started: the last few years have seen a rapid rise in apps that aim to help us effortlessly absorb new material. Duolingo – an app that teaches foreign languages through playful games – already has around 40 million users, while programs like Cerego and Memrise aim to teach more general subjects, based on a growing understanding of the way the brain learns and forgets information.

But these could just be the tip of the iceberg, said Spector – if the technology follows three important principles. Firstly, he points to research showing that even average students can reach the top 2% of a class if they have a personal tutor who can adapt their teaching methods to the student’s style of thinking and learning. “If it were the case that technology could become custom tutors, then it’s possible to imagine enormous improvements in educational attainment,” said Spector.

In fact, Memrise and Cerego already use this strategy to a certain extent – by tracking patterns in the ways users remember and forget information – but more sophisticated approaches may emerge with time. In other words, we may all be able to enjoy the advantages that were once only available to the richest children.
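What does “tracking patterns in the ways users remember and forget” look like in practice? A common technique is spaced repetition: every successful recall stretches the interval before the next review, and every lapse shrinks it back. The sketch below is a bare-bones Python illustration loosely in the spirit of the classic SM-2 scheduling algorithm – neither Memrise nor Cerego publishes its exact method, so treat this as an assumption about the general approach, not their actual code:

from dataclasses import dataclass

@dataclass
class Card:
    interval_days: float = 1.0  # wait this long before the next review
    ease: float = 2.5           # how fast the interval grows

def review(card, recalled):
    """Update one flashcard's schedule after a review."""
    if recalled:
        card.interval_days *= card.ease        # remembered: wait longer
    else:
        card.interval_days = 1.0               # forgotten: start over
        card.ease = max(1.3, card.ease - 0.2)  # and grow more cautiously
    return card

card = Card()
for recalled in (True, True, False, True):
    card = review(card, recalled)
    print(f"next review in {card.interval_days:.1f} days (ease {card.ease:.2f})")

The personalization comes from the fact that every card carries its own schedule: material you find hard comes back quickly and often, while material you know well drifts further and further apart.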

Spector also pointed out that designers have already mastered the way to create immersive, compelling environments – through video games – that could take the boredom out of studying. “We have user interfaces that are so exciting that people play video games for hours and hours a day, and they could be educated by them.” Besides adding interest, Daphne Bavelier at the University of Rochester, New York, says that by effortlessly focusing our attention, immersive environments can also improve the kind of memory that allows you to learn a musical instrument, or a foreign language, and which normally shuts down after childhood.

Finally, Spector says that social networks could be used to increase the interaction between students. “I learnt as much from fellow students as we do from the faculty,” he told the WCIS audience. Indeed, part of the popularity of apps like Memrise is the fact that users can share their experiences.

Spector readily admits that all of this might sound a bit pie-in-the-sky at the moment, and may only attract the more dedicated users, but the obvious enthusiasm with which people are devouring apps like Duolingo suggests there is a genuine interest. “It may be enough to get this started,” he says. He thinks much of the necessary technology is already there – it just needs to be packaged in a more attractive way.

If so, he believes it could fundamentally change our society – perhaps even abolishing the need for schools. “In the past it seemed you had to go to a school to get formal education – there was no choice but to go to isolated places to be educated, but now we don’t have to do that. We may choose to but we don’t have to.” Or perhaps universities will cut down the number of years you spend on campus – allowing you to finish your education with distance learning. And since this kind of teaching is personal rather than delivered to a whole class at once, future students may be able to pick and choose the parts that most appeal to them while avoiding the more tedious parts of their subject.

Despite his obvious enthusiasm, Spector recognises that there could be some unexpected downsides to this kind of effortless learning, should it ever be achieved. “How do we learn grit?” he asked the audience. “If everything is a computer game how do you learn to deal with that challenge that you spent all night trying to read Chaucer and write a summary of it?” For many, though, that is probably a small price to pay if the easier path to learning makes us all a bit smarter, and a bit more creative.

As another speaker at the conference, Alexis Ohanian – the co-founder of reddit – put it: “The internet has flattened the world… It allows us to learn anything that we want. A teenager with a smartphone has access to more knowledge than the president did a few decades ago.”

 



 