Wikidata IRC office hour is happening now 4.9.19 at 9:30am PT https://meta.wikimedia.org/wiki/IRC_office_hours on the #wikimedia-office channel to present the state of development, the plans for the future, & answer all of your questions. Notes of the meetings will be published. https://webchat.freenode.net/?channels=#wikimedia-office
- https://twitter.com/WorldUnivAndSch/status/1115664413703331841 -
*
[Wikidata] Next IRC office hour on April 9th
Hi Lydia, Lea, Kate and Katherine (and all),
Just checking in: Looks like World Univ & Sch (Scott_WUaS) is blocked from asking questions in the office hour today, and also in another recent related office hour event (with Kate Chapman) in IRC:
And WUaS isn't in the log here - https://www.wikidata.org/wiki/Wikidata:Events/IRC_office_hour_2019-04-09 . Thanks, Lea!
Am wondering then how best to communicate about further co-developing our World Univ & Sch donation to Wikidata (made in 2015, with WUaS getting its Miraheze MediaWiki in 2017) to make them interoperable (a long-standing question)?
For example, how please could I add this WUaS "Wikidata_databases_and_ecosystems" wiki subject as a "front end" - https://wiki.worlduniversityandschool.org/wiki/Wikidata_databases_and_ecosystems (or any of the others here - https://wiki.worlduniversityandschool.org/wiki/Subjects) - to Wikidata / Wikibase as a "back end" (paralleling a Wikipedia page as a "front end" with Wikidata as a back end: for example, front end: https://en.wikipedia.org/wiki/Nontheist_Quakers - and back end: https://www.wikidata.org/wiki/Q7049628)? And how also please could WUaS do this for all 725 WUaS wiki pages currently here - https://wiki.worlduniversityandschool.org/wiki/World_University ?
And my question in this office hour too has to do with adding languages - since WUaS would like to make it possible for our linguists and anthropologists, for example, to add any of the remaining languages of the 7,111 living languages (https://www.ethnologue.com/) or of the 8,496 entries listed in Glottolog (https://glottolog.org/glottolog/language), and eventually in voice - am thinking conceptually re Google Voice / Google Translate too.
How best to communicate if WUaS is blocked in IRC office hours?
Thank you so much, Scott
Nation States' Univs - https://wiki.worlduniversityandschool.org/wiki/Nation_States (Planning major CC-4 MIT OpenCourseWare-centric universities online in each of ~200 countries' official / main languages)
All languages - https://wiki.worlduniversityandschool.org/wiki/Languages (Planning wiki schools for open teaching and learning in each of all 7,111 living languages from here)
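For context on the "front end / back end" pairing described above (a wiki page displaying data stored in Wikidata), here is a minimal sketch of how any front end can read an item through Wikidata's standard wbgetentities API. The helper names are mine, and the response is a hand-trimmed sample so the sketch runs offline - it is an illustration, not WUaS's actual integration.

```python
# Sketch: a "front end" page pulling labels from Wikidata as a "back end"
# via the standard wbgetentities API. Q7049628 (Nontheist Quakers) is the
# example item mentioned above.
from urllib.parse import urlencode

API = "https://www.wikidata.org/w/api.php"

def entity_request_url(qid):
    """Build a wbgetentities request for an item's labels and descriptions."""
    params = {
        "action": "wbgetentities",
        "ids": qid,
        "props": "labels|descriptions",
        "format": "json",
    }
    return API + "?" + urlencode(params)

def english_label(response, qid):
    """Pull the English label out of a wbgetentities JSON response."""
    return response["entities"][qid]["labels"]["en"]["value"]

# A trimmed-down sample of what the API returns for Q7049628:
sample = {
    "entities": {
        "Q7049628": {
            "labels": {"en": {"language": "en", "value": "Nontheist Quakers"}},
            "descriptions": {},
        }
    }
}

print(entity_request_url("Q7049628"))
print(english_label(sample, "Q7049628"))  # Nontheist Quakers
```

A front-end wiki page would render the fetched label/description in place of locally stored text; that is the pattern the Wikipedia/Wikidata pairing above uses.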
Just checking in: Looks like World Univ & Sch (Scott_WUaS) is blocked from asking questions in the office hour today, and also in another recent related office hour event (with Kate Chapman) in IRC:
<Vigneron> (for people like me who just heard of "signed statements" where can I find some more basic explanation of what it is and how it works?)
<superyetkin> Auregann_WMDE: thanks for the link, I will examine it
<Scott_WUaS> whispers “hi everyone!” & "thank you!" for these great Wikidata developments
== Cannot send to nick/channel: #wikimedia-office
<Auregann_WMDE> Vigneron: Here :) https://www.wikidata.org/wiki/Wikidata:Development_plan#Signed_statements
<Vigneron> Auregann_WMDE: thanks !
<Auregann_WMDE> A few minutes left, do you have any other questions? :)
<Lydia_WMDE> It is currently not shown on the mobile view. It is currently also not possible to add a new language that isn't already there or in your prefered languages. We want to change both things.
On Mon, Apr 8, 2019 at 1:53 AM, Léa Lacroix <lea.lacroix@wikimedia.de> wrote:
Hello all,
The next Wikidata IRC office hour will take place on April 9th at 16:30 UTC (18:30 in Berlin). As usual, we will meet on the #wikimedia-office channel to present the state of development, the plans for the future, and answer all of your questions.
If you can't join, the notes of the meetings will be published after the meeting.
Sorry for the late announcement and see you there!
--
Léa Lacroix
Project Manager Community Communication for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Registered in the register of associations of the Amtsgericht Berlin-Charlottenburg under number 23855 Nz. Recognized as charitable by the Finanzamt für Körperschaften I Berlin, tax number 27/029/42207.
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata
* *
Having donated World Univ & Sch to Wikidata for co-development in 2015, WUaS seems to be "blocked" on the Wikidata email list too, with this message:
"Your mail to 'Wikidata' with the subject
Re: [Wikidata] Invitation to join Linked Data for Libraries
Wikidata Affinity Group
Is being held until the list moderator can review it for approval.
The reason it is being held:
Post to moderated list"
*
In response to:
[Wikidata] Invitation to join Linked Data for Libraries Wikidata Affinity Group
Hello all,
I'm a new Wikimedian in Residence as part of the Linked Data for Production project. One of the goals of the project is understanding how libraries can contribute to and leverage Wikidata as a platform for publishing, linking, and enriching library linked data. A number of institutions that are part of the grant are working on projects involving Wikidata and we decided to start an interest group with biweekly meetings to discuss various aspects of Wikidata in support of the projects. Possible topics include Wikidata best practices, documentation, communication channels, policies, and tools.
At each meeting, myself or a guest will present some relevant material related to the topic and we’ll discuss any issues members have encountered as well as helpful resources. At the first meeting on April 23rd, we’ll talk about the purpose and goals of the group as well as the Wikidata related projects that are part of the grant.
I'd like to invite any interested Wikidata community members to join us. The call details and communication channels are below.
First call details:
April 23, 2019 9am PST / 12noon EST / 5pm GMT / 6pm CET
Agenda: https://docs.google.com/document/d/1BuszEQQxlOY14hK60Fl7n8Huvh6jEWXre0-wSvpyq84/edit?usp=sharing
Join: https://stanford.zoom.us/j/204437188
Communication:
Ld4-wikidata Google group: https://groups.google.com/d/forum/ld4-wikidata
#wikidata channel on LD4 Slack: http://bit.ly/joinld4slack
Notes in public LD4 Wikidata folder: https://drive.google.com/drive/folders/1JwTulCABs0TkGQDVSnYbIYEb7bC-j4-n
Website: https://wiki.duraspace.org/display/LD4P2/Wikidata+Affinity+Group
The Affinity Group is open to anyone interested in libraries and Wikidata, so feel free to share this invitation.
Hilary Thorsen
Wikimedian in Residence
Digital Library Systems and Services
Stanford Libraries
Lathrop Library
Stanford, CA 94305
thorsenh@stanford.edu
650-285-9429
*
Hi Hilary,
I noted this comment on your doc:
" what tools haven't been developed yet, but are also in demand by Wikimedia and library communities? so we can collaborate"
So what features or changes could be added to our OpenRefine to help with any Partner Projects? We're all ears.
*
Just emailed both:
Hi Hilary, and Wikidatans,
Greetings. Thanks so much for this Linked Data for Libraries Wikidata Affinity Group and call. Having donated WUaS to Wikidata in 2015 for co-development, and in developing wiki World University and School, which is like Wikipedia in ~300 languages with best STEM CC-4 OCW in 5 languages, WUaS would like to develop wiki library pages for online libraries in each of all 7,111 living languages. Here's the beginning Library Resources' wiki page in English - https://wiki.worlduniversityandschool.org/wiki/Library_Resources - but not yet connected with Wikidata / Wikibase. WUaS is also seeking to develop online university libraries for matriculated WUaS students (as we license and accredit with the state of California's BPPE + for free-to-students' university and high school degrees) in online WUaS Universities in each of all ~200 countries' main / official languages from here - https://wiki.worlduniversityandschool.org/wiki/Nation_States. May seek to join the call on April 23. Thank you. Friendly greetings, Scott (worlduniversityandschool.org)
*
Just added 'Stanford' and 'libraries' to the Labels' list for this blog, as a consequence of this email exchange ...
*
Will see if my email goes through eventually ...
* *
April 10, 2019
Hi Hilary, Thad, Claudio, and Wikidatans,
Thanks for your emails. Am re-sending this to you, since this message was rejected in sending it to the entire group. How best to wiki-add libraries in the remaining 6811 known living languages at World University and School is another important question (Have blogged here - http://scott-macleod. blogspot.com/2019/04/marine- iguana-galapagos-looks-like. html - about 2 related 'blocks' re Wikidata/Wikimedia communications).
Greetings. Thanks so much for this Linked Data for Libraries Wikidata Affinity Group and call. Having donated WUaS to Wikidata in 2015 for co-development, and in developing wiki World University and School, which is like Wikipedia in ~300 languages with best STEM CC-4 OCW in 5 languages, WUaS would like to develop wiki library pages for online libraries in each of all 7,111 living languages. Here's the beginning Library Resources' wiki page in English - https://wiki.worlduniversityandschool.org/wiki/Library_Resources - but not yet connected with Wikidata / Wikibase. WUaS is also seeking to develop online university libraries for matriculated WUaS students (as we license and accredit with the state of California's BPPE + for free-to-students' university and high school degrees) in online WUaS Universities in each of all ~200 countries' main / official languages from here - https://wiki.worlduniversityandschool.org/wiki/Nation_States. May seek to join the call on April 23. Thank you.
Friendly greetings, Scott (worlduniversityandschool.org)
Planning all 7,111 living languages - https://wiki.worlduniversityandschool.org/wiki/Languages
Languages-World Univ: https://twitter.com/sgkmacleod
*
There are some amazing recent developments mentioned in this Wikidata office hour today:
== Scott_WUaS [d04c1d1a@gateway/web/freenode/ip.208.76.29.26] has joined #wikimedia-office
<Lydia_WMDE> We've now worked on making it possible to store these on Wikidata itself. You can already test it on https://wikidata-shex.wmflabs.org and hopefully soon on Wikidata itself.
* Lucas_WMDE whispers “hi everyone!”
<Lydia_WMDE> We are currently only waiting on the security review and then we're ready to go.
<Lydia_WMDE> This will be another important piece to help us all keep the data on Wikidata in good shape.
<Auregann_WMDE> In the meantime, if you want to learn more about Shape Expressions, you can check the documentation on the wikiproject page https://www.wikidata.org/wiki/Wikidata:WikiProject_ShEx
<Lydia_WMDE> Another thing that occupied a lot of the dev time was the termbox. That's the box that holds labels, descriptions and aliases.
<Auregann_WMDE> also known as "the box that holds labels, descriptions and aliases"
<Lydia_WMDE> It is currently not shown on the mobile view. It is currently also not possible to add a new language that isn't already there or in your prefered languages. We want to change both things.
<Lydia_WMDE> And that requires quite some work behind the scenes to make it all come together.
<Lydia_WMDE> The good thing is that as part of that we are experimenting with better technology that will help us in the future in a lot of other areas of Wikidata as well.
<Lydia_WMDE> That will still take some time to complete but it's making good progress.
<Lydia_WMDE> Then we wanted to be a bit more artsy and added a new datatype for musical notation so you can have pretty notes to illustrate songs and so on.
<Lydia_WMDE> We also did a lot of small usability fixes for Lexemes and made them show up nicer on social networks. So when you now share a link to a Lexeme it'll get a nice card on Twitter and co.
<Lydia_WMDE> you can try it here: https://cards-dev.twitter.com/validator
<Lydia_WMDE> Another cool thing coming soon is the URL shortener.
<superyetkin> what is the use case for URL shortening?
<Lydia_WMDE> It's been discussed a lot in Wikimedia for a long time but it was especially important for Wikidata because the shortlinks to queries on the queryservice use tinyurl which is blacklisted on wiki so you couldn't share those links in discussions. Meh!
<superyetkin> I see
<Lydia_WMDE> Soon we will have our own one that only links to Wikimedia sites and not blocked.
<Lydia_WMDE> Another thing we did that was requested is to add no-follow indicators to all the external identifier links in Wikidata so they are less attractive for search engine optimisation people who are spamming Wikidata
<Lydia_WMDE> On the more hard-core technical side we investigated how to continue improving the wb_terms database table. It's one of the pieces that is getting a lot more painful for the developers as Wikidata grows. Since we don't want to tell people to stop adding things to Wikidata we need to spend time on that at the moment.
<Lydia_WMDE> And last but not least we supported the WMF team working on Structured Data on Commons so they can make Commons better and helped a few large Wikibase installations so they can continue open up more data.
== Tulsi [uid192784@wikimedia/Tulsi-Bhagat] has quit [Quit: Updating details, brb]
<Auregann_WMDE> Thanks Lydia! Plenty of things have been done during this first quarter of 2019 and more is to come :)
<Auregann_WMDE> Now let's go through a few other things that are not directly development-related
<Auregann_WMDE> First of all, Wikidata has two new admins: Esteban16, Stanglavine, welcome onboard!
== Tulsi [uid192784@wikimedia/Tulsi-Bhagat] has joined #wikimedia-office
<Auregann_WMDE> During the Wikimedia Summit in Berlin, WMDE and WMF people have been discussing about the strategy behind Wikidata and Wikibase. We mostly wrote down things that we’ve been already doing or discussing with the community over the past years. The result of this will be published soon. The discussions were focused on 3 big areas: Wikidata for the Wikimedia projects, Wikidata as a platform, and the Wikibase ecosystem.
<Auregann_WMDE> once the documents are online, we'll be very happy to get your input :)
<Auregann_WMDE> We asked you for input about issues with adding new language codes in Wikidata: https://www.wikidata.org/wiki/Wikidata:Identify_problems_with_adding_new_languages_into_Wikidata A wrap-up has been done, with a suggestion for next steps, now some community members should handle it if they want to :)
<Auregann_WMDE> WikidataCon applications are open!!!1! You can now apply to participate, as well as submit program sessions or apply for a scholarship. Deadline is April 29th, don’t miss it, as it won’t be possible to get a seat after this date! https://www.wikidata.org/wiki/Topic:Uwnvvuwlnox50iw8
<Auregann_WMDE> And of course you can read more details about the conference on this page and its subpages https://www.wikidata.org/wiki/Wikidata:WikidataCon_2019
<Vigneron> \o/
<Auregann_WMDE> Ever wondered how to optimize your sparql query so it runs faster and doesn’t time out? You can find some tips here https://www.wikidata.org/wiki/Wikidata:SPARQL_query_service/query_optimization#Optimization_strategies
<Auregann_WMDE> JanAinali has been experimenting with live-streaming Wikidata editing on Twitch https://www.twitch.tv/janainali That's awesome, we want moar \o/
<Auregann_WMDE> And now here’s one of my favorite sections of the meeting: presenting all the cool new tools developed by the community \o/
<Auregann_WMDE> SpeedPatrolling, a tool that helps Wikidata editors to patrol recent changes https://tools.wmflabs.org/speedpatrolling/
<Auregann_WMDE> Related Properties, providing statistics about the usage of properties in Wikidata items https://tools.dicare.org/properties/
<Auregann_WMDE> New dashboard giving the percentage of articles making use of data from Wikidata on all Wikimedia projects https://wdcm.wmflabs.org/WD_percentUsageDashboard/
<Auregann_WMDE> A great dashboard showing property statistics for sum of all paintings: https://www.wikidata.org/wiki/Wikidata:WikiProject_sum_of_all_paintings/Property_statistics and now also on the Wikiproject Video games! Feel free to adapt it for your own project :)
<Auregann_WMDE> During the Wikidata hackathon in Ulm, a bunch of cool projects have been developed. See for example the card game generator https://cardgame.morr.cc/ , a multiplayer game “guess the (German) politician” https://gtp.krmax44.de/
== Tulsi [uid192784@wikimedia/Tulsi-Bhagat] has quit [Quit: Updating details, brb]
<Auregann_WMDE> (it's pretty hard, I fail all the time against my colleagues :( )
<Auregann_WMDE> Also: a Telegram bot, because it’s a thing now! There is @WikidataMisfitBot, a game sending you pictures where one has a wrong label
<Auregann_WMDE> But there’s also the Wikidata Search bot, @wkdt_bot, you can basically search Wikidata directly from Telegram
== Tulsi [uid192784@wikimedia/Tulsi-Bhagat] has joined #wikimedia-office
<Auregann_WMDE> Speaking about Telegram, the Wikidata group is pretty active, more than IRC these days ;) https://t.me/joinchat/AZriqUj5Uag92TB4U9eBdQ
<Auregann_WMDE> Back to tools: here’s the Wikidata mapping validator https://tools.wmflabs.org/tptools/wd_mapping_validator.php
<Auregann_WMDE> A tool generating the shape of a country based of the location of its big cities https://cities.k-nut.eu/
<Auregann_WMDE> And last but not least, the awesome Wikidata History Query Service, querying through the edit history of Wikidata! https://www.wikidata.org/wiki/Wikidata:History_Query_Service
<Auregann_WMDE> Now, here is the traditional list of “recommended reading” that you will open in a tab, then forget about, then remember later :D
<Auregann_WMDE> Lexicographical data on Wikidata: Words, words, words https://blog.wikimedia.de/2019/03/25/lexicographical-data-on-wikidata-words-words-words/
<Auregann_WMDE> Inside the Alexa-friendly world of Wikidata https://www.wired.com/story/inside-the-alexa-friendly-world-of-wikidata/
<Auregann_WMDE> Combining AI and Human Judgment to Build Knowledge about Art on a Global Scale https://www.metmuseum.org/blogs/now-at-the-met/2019/wikipedia-art-and-ai
<Auregann_WMDE> Data Quality Management in Wikidata – Workshop write-up https://blog.wikimedia.de/2019/03/07/data-quality-management-in-wikidata-workshop-write-up/
<Auregann_WMDE> Wikidata and the sum of all video games https://commonists.wordpress.com/2019/01/01/wikidata-and-the-sum-of-all-video-games-%E2%88%92-2018-edition/
<Auregann_WMDE> Making Wikidata visible http://blogs.bodleian.ox.ac.uk/digital/2019/01/23/making-wikidata-visible/
<Auregann_WMDE> What Wikidata offers Oxford’s GLAM Digital Strategy http://blogs.bodleian.ox.ac.uk/digital/2019/03/14/what-wikidata-offers-oxfords-glam-digital-strategy/
<Auregann_WMDE> Let’s Tango: Computational Musicology Using Wikidata, MusicBrainz and the Wolfram Language https://blog.wolfram.com/2019/02/14/lets-tango-computational-musicology-using-wikidata-musicbrainz-and-the-wolfram-language/
<Auregann_WMDE> That's it for now :)
<Auregann_WMDE> (advertisement break: when you see a cool article or tool, don't forget to add it to the Weekly Summary! It makes sun shine and Léa happy)
<Auregann_WMDE> And now Lydia_WMDE, what's going to happen next?
<Lydia_WMDE> All the good stuff!
<superyetkin> wow
<Lydia_WMDE> We'll roll out support for Schemas as I said earlier.
<Lydia_WMDE> We will also finish up the work on getting the termbox on mobile visible and editable.
<Lydia_WMDE> We'll roll out a new level for the constraints so that you can indicate a constraint is only a suggestion instead of a big violation. This is for example useful if you want to say "if an item has a date of birth then it should probably also have a place of birth".
== eprodromou [~evan@wikimedia/EvanProdromou] has quit [Quit: eprodromou]
<Lydia_WMDE> We will write the first code to get us to editing Wikidata's data directly from Wikipedia. This has been requested many many times and is a major hurdle still for using more of Wikidata's data in Wikipedia and co.
<Lydia_WMDE> We will also look into planning and design for signed statements. This is another thing that'll help us keep data quality high by making it harder to change a statement while still keeping the old reference.
<Lydia_WMDE> And last but not least we will overhaul the website for Wikibase so people can actually understand what it is and why they'd want to use it.
<Auregann_WMDE> Such good stuff \o/
<Auregann_WMDE> Now, we still have 30 minutes for questions and discussions
<Auregann_WMDE> So, feel free to ask anything you want :)
<superyetkin> when I look at https://www.wikidata.org/wiki/Q388995#sitelinks-wikipedia I can see there are two values for population
<superyetkin> how can I fetch the date for those properties?
<superyetkin> I mean the literals 2015 and 2018
== katpatuka [~roman@85.97.216.136] has joined #wikimedia-office
<Lydia_WMDE> Those are the qualifiers.
<Lydia_WMDE> Do you want to use them in Lua?
<Lydia_WMDE> Or?
<superyetkin> on Wi
<superyetkin> on WQikipedia
<superyetkin> through Lua or templates
<superyetkin> parser functions
<Lydia_WMDE> Unfortunately the parser function doesn't give you access to that.
<superyetkin> we are currently displaying the up-to-date populations of cities but not the years associated with them
<Lydia_WMDE> It only returns the value with the "best" rank. In this case the first.
<Lydia_WMDE> So it is only possible in Lua.
<superyetkin> https://tr.wikipedia.org/wiki/Trabzon shows the population but not the year :(
<Lydia_WMDE> Auregann_WMDE is trying to find some documentation
<Vigneron> superyetkin: https://tr.wikipedia.org/wiki/Trabzon the infobox is not using Wikidata in this article :/
<superyetkin> Lydia, we are using it :)
<superyetkin> look at the value in the "nüfus_toplam" parameter and you will see
<superyetkin> |nüfus_toplam = 779.379 and the actual Wikidata value is 807,903, which is correct
<superyetkin> we are trying to fetch as much data as we can from Wikidata
<Tpt[m]> "We will also look into planning and design for signed statements." That's great!
<Auregann_WMDE> superyetkin: you can have a look here https://www.mediawiki.org/wiki/Extension:Wikibase_Client/Lua#mw.wikibase.getBestStatements to get all the content of statements, including qualifiers
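The answer above is about Lua on-wiki, but the data shape behind the question is the same everywhere: a population (P1082) statement carries its year as a "point in time" (P585) qualifier. A small Python sketch of extracting it - the helper name is mine and the statement below is a hand-trimmed illustration of the Wikidata JSON shape, not a live API response:

```python
# Extract the P585 ("point in time") qualifier from a statement dict,
# as discussed above for population (P1082) values.
def point_in_time(statement):
    """Return the P585 qualifier time string of a statement, or None."""
    for snak in statement.get("qualifiers", {}).get("P585", []):
        return snak["datavalue"]["value"]["time"]
    return None

# Trimmed sample mirroring the JSON shape of a Wikidata population statement:
population_statement = {
    "mainsnak": {
        "property": "P1082",
        "datavalue": {"value": {"amount": "+807903"}},
    },
    "qualifiers": {
        "P585": [
            {"datavalue": {"value": {"time": "+2018-00-00T00:00:00Z"}}}
        ]
    },
}

amount = population_statement["mainsnak"]["datavalue"]["value"]["amount"]
print(amount, point_in_time(population_statement))
```

On-wiki, `mw.wikibase.getBestStatements` hands Lua modules this same structure, which is why the year is reachable from Lua but not from the plain parser function.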
<Lydia_WMDE> Tpt[m]: \o/ If you have input for it I'd love to hear it in an email or call.
<Spinster> I'm also very interested in signed statements, and I know some GLAMs who would like to try them if that's helpful
<Lydia_WMDE> Spinster: absolutely!
<Lydia_WMDE> Spinster: send me all the things :)
<sjoerddebruin> what's the eta for constraints in the query service?
<superyetkin> thanks Lydia, do you know a Lua module using that client library?
<Spinster> So in the upcoming quarter you are going to work on planning/design for signed statements, with the intention to start implementing later in the year if that works out?
<Auregann_WMDE> superyetkin: I'm looking at the code of the infobox but i see that the number is written locally, is there something I'm missing here?
<superyetkin> creating one from scratch would be really hard at the moment :)
<Lydia_WMDE> sjoerddebruin: very good question. My last status update was that it's blocked on a service we need which needs some coordination with the WMF and an RFC. Let me get the link for that.
<superyetkin> Auregann: |nüfus_toplam = 779.379 and the actual Wikidata value is 807,903, which is correct
<superyetkin> the value in the infobox parameter is outdated
<Lydia_WMDE> sjoerddebruin: https://phabricator.wikimedia.org/T214362 needs to get unstuck. I'll poke about it again.
<Lydia_WMDE> Spinster: yes
<Auregann_WMDE> superyetkin: so you're editing the value by hand, not displaying it automatically from Wikidata?
<sjoerddebruin> Lydia_WMDE: great
<superyetkin> no, we are fetching it from Wikidata
<Auregann_WMDE> superyetkin: here https://en.wikipedia.org/wiki/South_Pole_Telescope you can see that the "construction" values come from qualifiers on https://www.wikidata.org/wiki/Q1513315
<Auregann_WMDE> superyetkin: and here's the template itself https://en.wikipedia.org/wiki/Template:Infobox_telescope
<Auregann_WMDE> superyetkin: how? with a bot?
<superyetkin> hmm.. {{#invoke:WikidataIB |getQualifierValue |P79 will help, I think
<superyetkin> thanks for that
<Tpt[m]> Lydia_WMDE: I have not really an opinion except that it should be easy for third party code to compute the signature.
<Lydia_WMDE> Tpt[m]: *nod*
<Tpt[m]> so, it should not rely on PHP specific thing unlike some hashes used in the datamodel
<chicagohil> Lydia_WMDE: I'm interested in signed statements as well and am working with some libraries that would be interested in them. Is there any documentation about that project already?
<Lydia_WMDE> chicagohil: so far not really. If you want i'm happy to have a chat with you.
<chicagohil> that would be great. Thank you
<Lydia_WMDE> superyetkin: https://www.wikidata.org/wiki/User:HakanIST might be a good person to reach out to for some help
<Lydia_WMDE> chicagohil: cool. i'll send you a link to my calendar in a private chat.
<pintoch> actually, I'm not too sure about signed statements, if I can bring in a different note ^
<pintoch> (yay I can finally speak!)
<superyetkin> yes, I know HakanIST, lydia
<Lydia_WMDE> pintoch: heh I want to hear that too obviously
<Auregann_WMDE> yay pintoch \o/
<pintoch> I don't know, is public key cryptography really the right solution for the problem of discarding references when statement values change?
<pintoch> thanks Auregann_WMDE! :)
<Lydia_WMDE> pintoch: *nod* let's also find time for a longer chat about it?
<Lydia_WMDE> I'm currently in the phase of gathering all the input so some longer chats really help me
<pintoch> sure! for me this is more of an UI issue really (but given the enthusiasm above I might well be wrong!)
<Lydia_WMDE> hehe
<Lydia_WMDE> superyetkin: great! If that works for you I recommend talking to him.
<Spinster> I think it's really good to go to the 'why' of signed statements. I know that GLAMs want a way to indicate that certain statements are truly backed by them. By which technical means we support that, is another issue
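To make the "signed statements" discussion concrete: the idea is that an institution endorses a specific statement value, and any later edit to that value invalidates the endorsement. Nothing below reflects Wikidata's actual design (which was only being planned at the time of this log); it is a purely conceptual Python sketch, using a keyed hash from the standard library as a stand-in for the public-key signature pintoch mentions, with an invented canonicalisation.

```python
# Conceptual sketch only: "sign" a statement so edits are detectable.
# A real scheme would use asymmetric keys; HMAC is a stdlib stand-in.
import hashlib
import hmac
import json

def canonical(statement):
    """Serialize a statement deterministically so signatures are stable."""
    return json.dumps(statement, sort_keys=True, separators=(",", ":")).encode()

def sign(statement, key):
    """Compute the institution's signature over the canonical form."""
    return hmac.new(key, canonical(statement), hashlib.sha256).hexdigest()

def verify(statement, key, signature):
    """Check whether the statement still matches an earlier signature."""
    return hmac.compare_digest(sign(statement, key), signature)

key = b"glam-institution-secret"  # stand-in for the institution's key
stmt = {"item": "Q1513315", "property": "P1082", "value": "807903"}

sig = sign(stmt, key)
print(verify(stmt, key, sig))   # True: the untouched statement checks out

stmt["value"] = "779379"        # someone later edits the value...
print(verify(stmt, key, sig))   # False: the old signature no longer matches
```

This is exactly the property Lydia describes ("making it harder to change a statement while still keeping the old reference"), and Tpt[m]'s point holds here too: the canonical form must be computable by third-party code, not tied to PHP internals.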
<superyetkin> we are already talking, Lydia :)
<Auregann_WMDE> btw superyetkin if you want to start a Wikidata-powered infobox from scratch, feel free to look at DataBox https://www.wikidata.org/wiki/Module:Databox
<superyetkin> Turkish Wikipedia user group is not very large due to the Wikipedia ban in Turkey :(
<superyetkin> that affects Wikidata as well
<Lydia_WMDE> :(
<Vigneron> (for people like me who just heard of "signed statements" where can I find some more basic explanation of what it is and how it works?)
<superyetkin> Auregann_WMDE: thanks for the link, I will examine it
<Scott_WUaS> whispers “hi everyone!” & "thank you!" for these great Wikidata developments
== Cannot send to nick/channel: #wikimedia-office
<Auregann_WMDE> Vigneron: Here :) https://www.wikidata.org/wiki/Wikidata:Development_plan#Signed_statements
<Vigneron> Auregann_WMDE: thanks !
<Auregann_WMDE> A few minutes left, do you have any other questions? :)
<Auregann_WMDE> btw if someone has knowledge in Lua, could you look at the issue raised by katpatuka on the #wikidata channel? I quote: "what may be the cause of "Lua错误:not enough memory。" errors on zh.wiki ? I get it for example on zh:新立街道 (七台河市) . Removing template {{PRC admin/navcat|23/09... removes the error"
<Auregann_WMDE> Alright, then I have a question :D What are the cool things you're working on these days?
<sjoerddebruin> A Wikimedian-in-Residence project for documenting Dutch university teachers. :)
<Lydia_WMDE> Nice!
<sjoerddebruin> One of the first Wikidata-specific ones afaik
<Spinster> I can +1000 that that is very cool
<Auregann_WMDE> Fancy!
<Vigneron> The WikidataCon :P
<Auregann_WMDE> Excellent choice ;)
<superyetkin> are there any plans to organize WikidataCon online?
<Lucas_WMDE> I’m working on some improvements to SpeedPatrolling – work around that annoying “page sometimes blank” issue, suggest thanking users you’ve frequently patrolled, suggest requesting blocks for users you’ve frequently rolled back
<superyetkin> considering the carbon footprint of the Wikidata project...
<superyetkin> I can remember reading something about this on Meta
<Vigneron> and other wiki-conferences (there is a lot comming: WikiEdu, Hackathon, Celtic Knot, Wikimania, and so on let's spread the Wikidata gospel there too \o/ )
<Lydia_WMDE> Lucas_WMDE: post a link! :D
<Spinster> The first WikidataCon had truly excellent video recording/streaming, are you thinking about doing that again?
<Auregann_WMDE> superyetkin: just like in 2017, we'll make sure that the main tracks will be live-streamed and available in video after the conference
<chicagohil> Working on how we can use Wikidata in a knowledge panel in our library catalog
<Auregann_WMDE> for discussions and spontaneous networking, it's slightly more complicated
<Vigneron> superyetkin: great question! at the very least there should be some recording online, maybe in realtime but we should think on how to do more active online participation
<Lydia_WMDE> chicagohil: yay
<superyetkin> Vigneron: I am looking forward to it
<Vigneron> chicagohil: very nice!
<Lucas_WMDE> Lydia_WMDE: well the link to the tool was already posted earlier, https://tools.wmflabs.org/speedpatrolling/
<Lucas_WMDE> just working on improvements
<Lydia_WMDE> :)
<Lucas_WMDE> (private time yadda yadda)
<Auregann_WMDE> also, we plan to keep alternating between one big conference in Berlin, and plenty of decentralized events like in 2018
<Vigneron> chicagohil: I'd like very much to see the result of that, is there already something we can see?
<bawolff> On subject of video streaming, if you have live streaming for a tech related conference, please link in sitenotice - I really want to promote remote participation options in conferences to new people/people not yet in our communities
<superyetkin> also, what about plans for green servers for Wikimedia projects?
<superyetkin> will that include Wikidata as well?
<Auregann_WMDE> so, in 2020 you'll have the opportunity to organize your own meetup and reach local community - and spend less ozone-killing energy :)
<superyetkin> bawolff: I completely agree with this
<Auregann_WMDE> bawolff: good idea
<chicagohil> Vigneron: not yet, but we'll have a development cycle starting in a few weeks, so I should have something to share by the beginning of May
<Vigneron> superyetkin: do you know https://meta.wikimedia.org/wiki/Sustainability_Initiative ?
<Lucas_WMDE> it looks like our eqiad datacenter should actually use renewable energy? https://en.wikipedia.org/wiki/Equinix#Sustainability
<
bawolff> I did it with the recent EMWCon, and i think it attracted more viewers
<
Lucas_WMDE> though not all our datacenters are with Equinix
<
Lucas_WMDE> but I’d assume eqiad is the most energy intensive one
<
superyetkin> Vigneron: yes, I remember that
<
superyetkin> good to hear that, Lydia
<
Auregann_WMDE> Alright people, let's close the official meeting for now- of course you can keep discussing after that ;)
<
Auregann_WMDE> Thanks a lot for your constructive questions and feedback
<
Auregann_WMDE> You can stay in touch with us anytime onwiki or various channels
<
superyetkin> thanks
<
Auregann_WMDE> See you soon!
<
Auregann_WMDE> #endmeeting
<
wm-labs-meetbot> Meeting ended Tue Apr 9 17:36:38 2019 UTC. Information about MeetBot at http://wiki.debian.org/MeetBot . (v 0.1.4)
<
wm-labs-meetbot> Minutes: https://tools.wmflabs.org/meetbot/wikimedia-office/2019/wikimedia-office.2019-04-09-16.30.html
<
wm-labs-meetbot> Minutes (text): https://tools.wmflabs.org/meetbot/wikimedia-office/2019/wikimedia-office.2019-04-09-16.30.txt
<
wm-labs-meetbot> Minutes (wiki): https://tools.wmflabs.org/meetbot/wikimedia-office/2019/wikimedia-office.2019-04-09-16.30.wiki
<
wm-labs-meetbot> Log: https://tools.wmflabs.org/meetbot/wikimedia-office/2019/wikimedia-office.2019-04-09-16.30.log.html
<
== heatherw [~administr@wikimedia/heatherawalls] has quit [Quit: heatherw]
Lydia_WMDE> Thanks everyone :)
<
Lydia_WMDE> o/
<
== jwslu [~jwslu@88.98.199.221] has quit [Ping timeout: 246 seconds]
== Vigneron [55aa74cf@gateway/web/freenode/ip.85.170.116.207] has quit [Quit: Page closed]
fuzheado> bawolff: Nice. Do you have any details on the setup the EMWCon used for video? Nice use of live video and presentation sldies
<
bawolff> I think they used a company called next day video
<
bawolff> CindyCicaleseWMF would know more
<
fuzheado> Ah, they hired out? Looks pro
<
bawolff> I think the person they hired may have been associated with the conference though, so not sure how much it was "hired out"
<
bawolff> But i agree they did a good job
<
== eprodromou [~evan@wikimedia/EvanProdromou] has joined #wikimedia-office
bawolff> which reminds me i need to review the video of my thing so it gets released on youtube
<
== SkarmoutsosV [~Skarmouts@194.219.33.43] has joined #wikimedia-office
== Lucas_WMDE [Lucas_WMDE@nat/wmf/x-acwsdgrcvtqrfozb] has left #wikimedia-office ["Good Bye"]
== heatherw [~administr@wikimedia/heatherawalls] has joined #wikimedia-office
== No such nick/channel: Vigneron
== End of WHOIS
== fuzheado [~fuzheado@wikipedia/Fuzheado] has quit [Quit: fuzheado]
== jgleeson [~jgleeson@wikimedia/jgleeson-wmf] has joined #wikimedia-office
== katpatuka [~roman@85.97.216.136] has left #wikimedia-office ["see you"]
*
Hi Lydia, Léa, Kate and Katherine (and all),
I'm re-sending this email inquiry about WUaS communication with Wikidata, since I got a bounce-back from Léa's email address (I may have mistyped it).
I also blogged about this here:
I'm presently working on initial licensing / accreditation of WUaS for Bachelor's / Ph.D. degrees in English with the State of California's BPPE, and it may make sense to develop our WUaS Miraheze MediaWiki with Wikidata / Wikibase in the process - for students' majors and programs, for a kind of MIT OpenCourseWare-style WUaS course catalog on each of our WUaS subject pages, for example, and in other ways too - especially in anticipating ~200 countries' main / official languages alongside Wikipedia's ~300 languages, eventually with machine translation.
Cheers, and thank you,
Scott
* *
And this could possibly be a partial response to some of my questions above:
[Wikitech-l] Discovery Weekly Update for the week starting 2019-03-25
1:34 PM (4 minutes ago)
Greetings,
This is the weekly update from the Search Platform team for the weeks
starting 2019-03-25 and 2019-04-01.
As always, feedback and questions are welcome.
== Discussions ==
=== Search ===
* ElasticSearch upgrade to v6:
** incident [0]
* Trey finished a deep dive into the performance of language
identification for cross-wiki searching [1] (example [2]) and
punctuation-related problems, and discovered things are working pretty
well overall, but the Chinese language model is a bit off.
* Erik noticed that the inlabel / incaption keywords should highlight
the label/caption but did not [3]
* David worked on fixing deprecation errors from Elasticsearch 6:
nested_path and nested_filter are deprecated [4], and
_retry_on_conflict was deprecated [5]
* We worked on migrating mjolnir to stdout/syslog/cee logging output [6]
* The team worked on the upgrade to elasticsearch 6.5.4 for cirrus / codfw
(specifically) [7] and for eqiad [8]
* Erik worked on the implementation and testing of glent m0
integration with wmf infrastructure [9]
* David did a lot of work to update the mw-config to use the psi & omega
elastic clusters [10]
* David found that auto_generate_phrase_queries is deprecated and
ineffective [11]
* The team fixed an old bug where we were getting fatal errors -
"cannot perform this operation with arrays" - from
CirrusSearch/ElasticaWrite (using JobQueueDB) [12]
* Gehel worked to make spicerack more robust when unfreezing writes to
elasticsearch / cirrus [13], as well as creating a cookbook to reset
the frozen write state on elasticsearch / cirrus [14]
* Stas moved the WikibaseLexeme search code to the WikibaseLexemeCirrusSearch
extension [15]
* We noticed that Elasticsearch indices went read-only, causing a huge lag [16]
* We also saw that search exception handling was printing response
information on the screen [17]
* The team fixed an issue where mwgrep was not working [18]
* We also fixed an issue where Elasticsearch 6 needed to silence
deprecation warnings to avoid logspam [19]
* We needed to create an extra elasticsearch cluster in the beta cluster [20]
* We also needed some alerts so we know if mjolnir starts misbehaving [21]
* We also converted the check_elasticsearch.py icinga plugin to py3 [22]
* We needed to start using a local nginx reverse proxy for connection reuse [23]
* The version of curator that we currently use (5.2.0) isn't
compatible with elasticsearch 6, which causes issues in a few cron jobs on
logstash servers (see below). Version 5.6.0 supports both
elasticsearch 5 and 6... so we updated it [24]
* We also did some cleanup of the reprepro configuration for
elasticsearch-curator [25]
* Getting a centralized way to inspect the content of the search
profiles might be helpful when investigating search behaviors. In the
same vein as other dump debug APIs (mapping/settings/cirrusdoc), David
suggested that we should add a new simple API to dump the profiles
(cirrus-profiles-dump) [26]
* David also found a call to a member function toArray() on a
non-object (null) in vendor/ruflin/elastica/lib/Elastica/Client.php:736
and fixed it [27]
[0] https://wikitech.wikimedia.org/wiki/Incident_documentation/20190327-elasticsearch report
[1] https://www.mediawiki.org/wiki/User:TJones_(WMF)/Notes/Review_of_Language_Identification_in_Production,_with_a_Special_Focus_on_Stupid_Identification_Tricks
[2] https://en.wikipedia.org/w/index.php?search=%D0%93%D0%B0%D1%80%D1%80%D0%B8+%D0%9F%D0%BE%D1%82%D1%82%D0%B5%D1%80%D0%B5
[3] https://phabricator.wikimedia.org/T217809
[4] https://phabricator.wikimedia.org/T219266
[5] https://phabricator.wikimedia.org/T219265
[6] https://phabricator.wikimedia.org/T218833
[7] https://phabricator.wikimedia.org/T218878
[8] https://phabricator.wikimedia.org/T218879
[9] https://phabricator.wikimedia.org/T218164
[10] https://phabricator.wikimedia.org/T210381
[11] https://phabricator.wikimedia.org/T219267
[12] https://phabricator.wikimedia.org/T124196
[13] https://phabricator.wikimedia.org/T219640
[14] https://phabricator.wikimedia.org/T219638
[15] https://phabricator.wikimedia.org/T216206
[16] https://phabricator.wikimedia.org/T219364
[17] https://phabricator.wikimedia.org/T216959
[18] https://phabricator.wikimedia.org/T219162
[19] https://phabricator.wikimedia.org/T219269
[20] https://phabricator.wikimedia.org/T213940
[21] https://phabricator.wikimedia.org/T214494
[22] https://phabricator.wikimedia.org/T215439
[23] https://phabricator.wikimedia.org/T215491
[24] https://phabricator.wikimedia.org/T218991
[25] https://phabricator.wikimedia.org/T216235
[26] https://phabricator.wikimedia.org/T218682
[27] https://phabricator.wikimedia.org/T217402
----
Subscribe to receive on-wiki (or opt-in email) notifications of the
Discovery weekly update.
https://www.mediawiki.org/wiki/Newsletter:Discovery_Weekly
The archive of all past updates can be found on MediaWiki.org:
https://www.mediawiki.org/wiki/Discovery/Status_updates
Interested in getting involved? See tasks marked as "Easy" or
"Volunteer needed" in Phabricator.
[1] https://phabricator.wikimedia.org/maniphest/query/qW51XhCCd8.7/#R
[2] https://phabricator.wikimedia.org/maniphest/query/5KEPuEJh9TPS/#R
Yours,
Chris Koerner (he/him)
Community Relations Specialist
Wikimedia Foundation
_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
* *
I see the following two emails from the Wikidata community as responses to my questions above:
*
[Wikitech-l] MediaWiki 1.33 branch announcement
Wed, Apr 10, 5:12 PM (1 day ago)
Hello!
We are beginning the process of working on the 1.33.0 release of
MediaWiki. The release is currently scheduled for May 2019.
We created the REL1_33 branch this week (aligned with the final
Wikimedia alpha, 1.33.0-wmf.25), and will be generating the first
release candidate soon, so this means it's now "pencils down".
If you have any open Phabricator tasks tagged with mw-1.33-release
[0], please check to see if they are indeed blockers for the release.
If not, please remove the tag from them. Conversely, if there are any
blockers that are not tagged with mw-1.33-release, please tag them.
Please feel free to reach out to me if you have any questions about
this.
We will be in the normal patch master + backport process from now
until the release in May.
Thanks,
Greg
[0] https://phabricator.wikimedia.org/tag/mw-1.33-release/
*
When this is available for Miraheze MediaWiki, we'll be able to begin developing the WUaS Miraheze MediaWiki (as a "front end") ... probably with WUaS in Wikidata (as a "back end").
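As a rough illustration of this front end / back end idea, here is a minimal sketch of how a front-end wiki page could pull its data from Wikidata via the real wbgetentities web API (the item Q7049628, Nontheist Quakers, is the example used above; the helper function names are illustrative, not an existing WUaS or Wikibase client library):

```python
# Sketch: fetch an item's label from Wikidata ("back end") for display
# on a wiki page ("front end"). Uses the standard wbgetentities API.
from urllib.parse import urlencode

WIKIDATA_API = "https://www.wikidata.org/w/api.php"

def build_entity_url(item_id, languages=("en",)):
    """Build a wbgetentities request URL for one item's labels/descriptions."""
    params = {
        "action": "wbgetentities",
        "ids": item_id,
        "props": "labels|descriptions",
        "languages": "|".join(languages),
        "format": "json",
    }
    return WIKIDATA_API + "?" + urlencode(params)

def extract_label(response, item_id, language="en"):
    """Pull one language's label out of a wbgetentities JSON response."""
    entity = response["entities"][item_id]
    return entity["labels"][language]["value"]

# Example with a canned response (a real call would fetch the URL over HTTP):
sample = {
    "entities": {
        "Q7049628": {"labels": {"en": {"value": "Nontheist Quakers"}}}
    }
}
print(build_entity_url("Q7049628"))
print(extract_label(sample, "Q7049628"))  # Nontheist Quakers
```

A front-end page would call this once per page view (or cache it), so edits made on Wikidata show up on the wiki page without duplicating the data.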
* *
[Wikimedia-l] Reviewing our brand system for our 2030 goals
5:37 AM (12 hours ago)
Hello all,
I'm forwarding this information, in case you didn't see this discussion yet.
In short, the Wikimedia Foundation is considering rebranding the organization as well as the projects from "Wikimedia" to "Wikipedia".
If you have an opinion on this, feel free to voice it on the Meta page, or directly to their team by email, until May 2019. https://meta.wikimedia.org/wiki/Communications/Wikimedia_brands/2030_research_and_planning/community_review
For more information, you can also check the blog post https://wikimediafoundation.org/2019/02/26/leading-with-wikipedia-a-brand-proposal-for-2030/ and the report from the agency https://meta.wikimedia.org/wiki/File:Overview_of_2030_Wikimedia_brand_research_and_planning.pdf
Cheers, Léa
---------- Forwarded message ---------
From: Zack McCune <zmccune@wikimedia.org>
Date: Tue, 26 Feb 2019 at 04:15
Subject: [Wikimedia-l] Reviewing our brand system for our 2030 goals
To: <wikimediaannounce-l@lists.wikimedia.org>, Wikimedia Mailing List <wikimedia-l@lists.wikimedia.org>
*
--
Léa Lacroix
Project Manager Community Communication for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Society for the Promotion of Free Knowledge e. V.
Registered in the register of associations of the Amtsgericht Berlin-Charlottenburg under number 23855 Nz. Recognized as a charitable organization by the Finanzamt für Körperschaften I Berlin, tax number 27/029/42207.
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/ma
*
Feb 25, 2019, 7:14 PM
:: Apologies for cross-posting to multiple mailing lists. We want to ensure
we spread the word about this opportunity to as many people as possible. ::
Hi all,
We are writing today to invite you to be a part of a community review on
Wikimedia brand research and strategy.
Recently, the Wikimedia Foundation set out to better understand how the
world sees Wikimedia and Wikimedia projects as brands.[1] We wanted to get
a sense of the general visibility of our different projects, and evaluate
public support of our mission to spread free knowledge.
We launched a global brand study to research these questions, as part of
our planning toward our 2030 strategic goals.[2] The study was commissioned
by the Board, carried out by the brand consultancy Wolff Olins, and
directed by the Foundation’s Communications team.[3][4] It collected
perspectives from the internet users of seven countries (India, China,
Nigeria, Egypt, Germany, Mexico and the US) on Wikimedia projects and
values.
The study revealed some interesting trends:
- Awareness of Wikipedia is above 80% in Western Europe and North America.
- Awareness of Wikipedia averages above 40% in emerging markets,[5] and is
fast growing.
- There is awareness of other projects, but it was significantly lower. For
example, awareness of Wikisource was at 30%, Wiktionary at 25%, Wikidata at
20%, and Wikivoyage at 8%.
- There was significant confusion around the name Wikimedia. Respondents
reported they had either not heard of it, or extrapolated its relationship
to Wikipedia.
- In spite of the lack of awareness of Wikimedia, respondents showed a high
level of support for our mission.
Following from these research insights, the Wolff Olins team also made a
strategic suggestion to refine the Wikimedia brand system.[6] The
suggestions include:
- Use Wikipedia as the central movement brand rather than Wikimedia.
- Provide clearer connections to the Movement projects from Wikipedia to
drive increased awareness, usage and contributions to smaller projects.
- Retain Wikimedia project names, with the exception of Wikimedia Commons,
which is recommended to be shortened to Wikicommons for consistency with the
other projects.
- Explore new naming conventions for the Foundation and affiliate groups
that use Wikipedia rather than Wikimedia.
- Consider expository taglines and other naming conventions to reassert the
connections between projects (e.g. “______ - A Wikipedia project”).
This is not a new idea.[7][8]
By definition, Wikimedia brands are shared among the communities who give
them meaning. So in considering this change, the Wikimedia Foundation is
collecting feedback from across our communities. Our goal is to speak with
more than 80% of affiliates and as many individual contributors as possible
before May 2019, when we will offer the Board of Trustees a summary of
community response.
We invite you to look at a project summary [9], the brand research [10],
and the brand strategy suggestion [11] Wolff Olins prepared working with us.
For feedback, please add comments on the Community Review talk page [12] or
email brandproject@wikimedia.org with direct feedback. You can also use
either of these channels to request to join a group meeting.
We know this is a big topic and we're excited to hear from you!
- Zack McCune and the Wikimedia Foundation Communications department
[1] https://wikimediafoundation.org/2019/02/07/how-does-the-world-see-wikimedia-brands/
[2] https://meta.wikimedia.org/wiki/Strategy/Wikimedia_movement/2018-20
[3] https://www.wolffolins.com/
[4] https://meta.wikimedia.org/wiki/Communications
[5] https://meta.wikimedia.org/wiki/Community_Engagement/Defining_Emerging_Communities
[6] https://wikimediafoundation.org/2019/02/26/leading-with-wikipedia-a-brand-proposal-for-2030/
[7] https://lists.wikimedia.org/pipermail/foundation-l/2007-May/029991.html
[8] https://commons.wikimedia.org/w/index.php?title=File%3AStrengthening_and_unifying_the_visual_identity_of_Wikimedia_projects_-_a_step_towards_maturity_-_Wikimania_2007.pdf&page=56
[9] https://meta.wikimedia.org/wiki/Communications/Wikimedia_brands/2030_research_and_planning/project_summary
[10] https://commons.wikimedia.org/wiki/File:Global_Wikipedia_and_Wikimedia_Brand_Research_Report.pdf
[11] https://commons.wikimedia.org/wiki/File:A_Wikimedia_brand_strategy_proposal_for_2030.pdf
[12] https://meta.wikimedia.org/wiki/Talk:Communications/Wikimedia_brands/2030_research_and_planning/community_review
* *
A:
Emailing a team directly (rather than the whole mailing list) has merit, and is something WUaS is beginning to do with Wikidatans.
B:
And, having donated WUaS to Wikidata for co-development in 2015, this raises questions about World University and School branding as well, particularly with regard to Wikipedia.
*
...