In what ways will WUaS develop BRAINWAVE headset learning WITH the best STEM CC OCW (e.g. MIT OCW in 7 languages and CC Yale OYC) in Wikidata for online degrees - and in all 7,943 languages too?
In envisioning this (and with reference to this blog's previous posts about brainwave headsets), I could see learners (wearing a Google Glass-connected brainwave headset, for example) searching through MIT OCW's 2,300 courses for the specific information they would need to engage with next, and uniquely, in a 36-course undergraduate degree program.
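As a rough, hedged sketch of what that kind of search might look like under the hood, here is a small example that looks up candidate Wikidata items for a few course topics via Wikidata's public search API. The topic list, the language code, and the idea that OCW course topics would map onto Wikidata items are illustrative assumptions, not a working WUaS degree planner.

```python
# A minimal sketch, assuming OCW course topics map onto Wikidata items.
# The topic list and language code below are illustrative placeholders.
import requests

WIKIDATA_API = "https://www.wikidata.org/w/api.php"

def search_wikidata(topic, language="en", limit=5):
    """Look up candidate Wikidata items for a course topic in a given language."""
    params = {
        "action": "wbsearchentities",
        "search": topic,
        "language": language,
        "format": "json",
        "type": "item",
        "limit": limit,
    }
    response = requests.get(WIKIDATA_API, params=params, timeout=10)
    response.raise_for_status()
    return [
        (item["id"], item.get("label", ""), item.get("description", ""))
        for item in response.json().get("search", [])
    ]

# Hypothetical fragment of a 36-course degree plan a learner might assemble.
degree_topics = ["linear algebra", "thermodynamics", "organic chemistry"]

for topic in degree_topics:
    for qid, label, description in search_wikidata(topic, language="en"):
        print(f"{topic}: {qid} - {label} ({description})")
```

The same search could, in principle, be run with a different language code for each of the 7,943 languages, wherever labels exist.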
But the great aspect of learning via the Conference Method and wiki is that learners can interactively engage, share what they are learning as they learn it, teach as learners, and in particular converse for idea generation.
Will one be able to add links, articles, and academic papers by drawing an icon with one's brainwave headset from one side of a screen to the other, thus contributing to a learning conversation?
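Here is a toy, hedged sketch of how such a gesture might be wired to a shared learning conversation. The HeadsetEvent class, the simulated event stream, and the simple list standing in for a wiki page are all hypothetical placeholders for a real brainwave-headset SDK and a real wiki API.

```python
# A toy sketch: a headset "drag" gesture contributes a link to a conversation.
# Everything here is simulated; a real headset SDK and wiki API would replace it.
from dataclasses import dataclass

@dataclass
class HeadsetEvent:
    gesture: str   # e.g. "drag_icon", as decoded from the headset's signal
    payload: str   # the link or reference the learner is contributing

conversation = []  # stands in for a shared wiki learning conversation

def handle_event(event: HeadsetEvent) -> None:
    """Append a contributed link when a drag gesture is detected."""
    if event.gesture == "drag_icon":
        conversation.append(event.payload)
        print(f"Added to the learning conversation: {event.payload}")

# Simulated events, as if decoded from a headset's signal stream.
simulated_events = [
    HeadsetEvent("drag_icon", "https://ocw.mit.edu/"),
    HeadsetEvent("blink", ""),  # ignored: not a contribution gesture
    HeadsetEvent("drag_icon", "https://oyc.yale.edu/"),
]

for event in simulated_events:
    handle_event(event)
```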
Will one be able to co-compose with one's mind, in a virtual world like OpenSim, a musical motif or a film or theatrical narrative, which another co-composer could then interweave their own narrative with?
So with textual, imagistic, and musical symbolizing alike ...
I'll try to post some examples of current brainwave headsets and how they work on this blog ... :)