Scott,

Thanks for the update. Watson is accessed via Bluemix, so search for Bluemix and Android.

http://www.ibm.com/developerworks/library/mo-android-mobiledata-app/

See this and the comments: https://www.linkedin.com/groups/Access-Watson-on-BlueMix-Today-6729452.S.5930081205515993091
The examples above used two of the seven Watson APIs in BlueMix:

1. Question and Answer: Vikash's Ebola example - http://ebolabigdata.org
2. User Modeling: Haan's Badge example - http://challengepost.com/software/tortellini-personality-badge

Some have suggested it would be a good grand challenge for the community to build a standard training set for identifying the 7,000 languages around the world ... and contribute to a third API:

3. Language Identification

-Jim
Thanks very much for your email.
To complement the other all-languages lists, I recently came across a new catalogue of languages, by some Swedes, I think, called Glottolog - http://glottolog.org/glottolog/language - which currently lists about 7,870 languages, more than the 7,106 indexed by ISO 639 and "The Ethnologue." I added this Glottolog URL to the main WUaS "Languages" wiki subject page - http://worlduniversity.wikia.com/wiki/Languages#Ideas - not yet in MediaWiki/Wikidata. I also added it to the beginning WUaS LANGUAGE_TEMPLATE - http://worlduniversity.wikia.com/wiki/LANGUAGE_TEMPLATE - where someone could eventually wiki-start a new university or school at WUaS, theoretically beyond the 288 languages in Wikipedia/Wikidata, drawing on their already existing communities of contributors and editors. I think the ISO languages standard is the sensible way to go, however, since Wikipedia/Wikidata also use it. It looks like Watson/BlueMix uses a five-character ISO-style code as well - http://www.ibm.com/smarterplanet/us/en/ibmwatson/developercloud/language-identification.html (http://www.ibm.com/smarterplanet/us/en/ibmwatson/developercloud/).
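To make the code families mentioned above concrete, here is a small illustration (my own examples, not taken from the email or from Watson's documentation): ISO 639 gives language codes at several granularities, and the familiar five-character tags combine a two-letter language code with a region code.

```python
# Hedged illustration of the language-code standards under discussion.
ISO_639_1 = "en"      # two-letter ISO 639-1 code for English (~180 languages)
ISO_639_3 = "eng"     # three-letter ISO 639-3 code (covers ~7,000 languages)
LOCALE_TAG = "en-US"  # five-character language-region tag (BCP 47 style)

def split_locale(tag):
    """Split a BCP 47-style tag like 'en-US' into (language, region)."""
    language, _, region = tag.partition("-")
    return language, region or None

print(split_locale("en-US"))  # ('en', 'US')
print(split_locale("eng"))    # ('eng', None)
```

Note that ISO 639-1 alone cannot name most of the world's languages; the three-letter ISO 639-3 codes are the level that approaches the ~7,000-language scope discussed here.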
Not knowing Bluemix/Watson yet, I couldn't deduce from the Ebola or Haan examples what a "standard training set" is (e.g. one that could be used "for identifying the 7,000 languages around the world"). I'd be interested in building such a standard training set into best STEM OpenCourseWare-centric World University and School as WUaS moves from the Wikia wiki to MediaWiki/Wikidata/Wikibase/qLabel, anticipating Wikipedia's 288+ languages ... Would this be possible?
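A minimal sketch of what such a training set might look like (an assumption on my part, not Watson's actual format): labeled text samples keyed by ISO 639-3 codes, here paired with a naive character-trigram classifier to show how the set would be consumed. A real community-built set would need many samples per language across all ~7,000 languages.

```python
# Sketch: a language-identification training set as ISO 639-3 code -> samples,
# plus a toy character-trigram classifier. Illustrative only.
from collections import Counter

TRAINING_SET = {
    "eng": ["the quick brown fox jumps over the lazy dog",
            "this is a sentence written in english"],
    "spa": ["el rapido zorro marron salta sobre el perro perezoso",
            "esta es una frase escrita en espanol"],
    "deu": ["der schnelle braune fuchs springt ueber den faulen hund",
            "dies ist ein satz auf deutsch geschrieben"],
}

def trigrams(text):
    """Counter of character trigrams in a lowercased string."""
    t = text.lower()
    return Counter(t[i:i + 3] for i in range(len(t) - 2))

# One trigram profile per language, summed over its samples.
PROFILES = {
    code: sum((trigrams(s) for s in samples), Counter())
    for code, samples in TRAINING_SET.items()
}

def identify(text):
    """Return the ISO 639-3 code whose profile best overlaps the text."""
    query = trigrams(text)
    def overlap(profile):
        return sum(min(query[g], profile[g]) for g in query)
    return max(PROFILES, key=lambda code: overlap(PROFILES[code]))

print(identify("una frase en espanol"))  # spa
print(identify("the lazy dog"))          # eng
```

The design point is that the training set itself is just open, labeled text, which is exactly the kind of artifact a wiki community could build and curate collaboratively, language by language.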
On 1/11/15 10:39 PM, Jim Spohrer wrote:
Thanks for the update.
Thought you might enjoy this: http://www.valuenetworksandcollaboration.com/home/aboutthisbook.html
Why a free live digital edition?
This book is being provided as a live digital edition for two reasons: 1) It can be instantly translated into any of the 52 languages supported by Google Translate, and 2) It will allow continuous updating and additions of cases and examples provided by users of this approach. We are publishing the details of conducting a Value Network Analysis (VNA) as a non-proprietary methodology. This means it can more readily meet the open method requirements of standards bodies. We encourage adoption of the method in its full integrity - as a method and framework for business modeling.
Your ambition is of course much greater than 52 languages, and I think the world is going your way - with more language translation capabilities from many vendors.
Exciting and thank you.
Verna Allee's digital book version,
"Value Networks and the true nature of collaboration" -
(which, as you wrote: (1) it can be instantly translated into any of the 52 languages supported by Google Translate, and (2) it will allow continuous updating and additions of cases and examples provided by users of this approach),
is a key coding kernel for a universal translator (one that extends networks remarkably across languages via a kind of wiki-book translation approach) -
and would be invaluable somehow to build into all-languages STEM OCW-centric wiki World University and School in a variety of other ways, and eventually with voice too.
I added Allee's book to WUaS at
Glad she went to UC Berkeley and that she/they're doing this translation of her book in conjunction with Google Translate.
My next question would be how to build this into Wikidata / the WUaS LANGUAGE TEMPLATE / potentially Watson/BlueMix, in collaboration with Google Translate and Google, all as a CC wiki for open learning and teaching in all 7,870 languages.