Wikipedia talk:The Core Contest


Who wants to organise this year?

My health is going up and down a bit too much to be a reliable organiser (should have posted in January). Would be a shame if my tardiness means there is no core contest this year. I'm willing to judge still. —Femke 🐦 (talk) 14:56, 10 March 2024 (UTC)[reply]

I had a great time judging last year, and would love to take more of a lead role as organiser/judge. Aza24 (talk) 16:50, 10 March 2024 (UTC)[reply]
(takes two steps back) okay @Aza24: happy for you to don the captain's hat this year! And reporting for duty! Cas Liber (talk · contribs) 07:45, 20 March 2024 (UTC)[reply]
Great! Will get things set up soon. Aza24 (talk) 07:47, 20 March 2024 (UTC)[reply]
And obviously any questions you have we'll be happy to answer. Lead on! Cas Liber (talk · contribs) 07:53, 20 March 2024 (UTC)[reply]

TCC 2024: Dates?

Hi all! We're starting a little later than last year, but I think that 15 April to 31 May, like last year, would be fine.

We'd still have a good three weeks to spread the word and rack up entries. Any thoughts? Aza24 (talk) 19:18, 22 March 2024 (UTC)[reply]

I intend to participate, and this sounds good to me. Thebiguglyalien (talk) 21:59, 22 March 2024 (UTC)[reply]
A short lead-in time but doable. Need to get cracking on all fronts quickly though. Cas Liber (talk · contribs) 19:29, 23 March 2024 (UTC)[reply]
+1. Submissions pages/alerts should probably be up in the next couple of days. ~~ AirshipJungleman29 (talk) 23:45, 23 March 2024 (UTC)[reply]
Indeed! I will get on this. Aza24 (talk) 01:02, 24 March 2024 (UTC)[reply]

Great name

Coretheapple (talk) 17:06, 25 March 2024 (UTC)[reply]

Agree Remsense 17:09, 25 March 2024 (UTC)[reply]

Question

Are there any rules prohibiting the improvement of more than one article? ~~ AirshipJungleman29 (talk) 16:52, 28 March 2024 (UTC)[reply]

There are not. The more entries, the more joy. Last year, we had two editors working on multiple articles: Artem G on various Mars rovers, and Phlsph7 on knowledge/education. If the articles differ significantly in coreness, we'll put more emphasis on the changes in the more core article :). Conversely, you're also allowed to work together on an article, which we saw often in the earlier contests. —Femke 🐦 (talk) 17:49, 28 March 2024 (UTC)[reply]
(Good to know.) Remsense 17:50, 28 March 2024 (UTC)[reply]
Actually, I'll put that out there now: does anyone maybe want to work on something together as well? Throw some ideas out there! Remsense 12:39, 6 April 2024 (UTC)[reply]

Initial stats

Good luck everyone. I've asked my script to generate some initial stats on all the articles. The total yearly pageviews for this group of articles is over 25 million(!), which I don't think we've had in a while.

Article | VIT | 2023 views | Words | References | Median Age
Mongol invasions and conquests | 4 | 1/4M | 1502 | 48 | 2001
History of North America | 3 | <100k | 3904 | 31 | 2003
Classical Chinese | 4 | >100k | 2326 | 22 | 1993
Modern philosophy | 4 | >100k | 1796 | 21 | 2008
Nursing | 3 | >1/2M | 9163 | 156 | 2015
Monarchy | 3 | >1/2M | 4936 | 52 | 2016
Turkey | 3 | >8M | 13746 | 684 | 2015
Wang Xizhi | 4 | <100k | 731 | 7 | 2018
Renewable energy | 3 | >M | 7729 | 355 | 2021
Federal Republic of Central America | 5 | >200k | 1009 | 15 | 2004
Moscow | 3 | >M | 15546 | 309 | 2017
Edward Oliver LeBlanc | 5 | <100k | 252 | 5 | 2015
Independent music | 5 | 150k | 1940 | 28 | 2016
Call of Duty | 5 | 3M | 4931 | 100 | 2014
Pakistan | 3 | 7M | 14331 | 657 | 2011
Voltairine de Cleyre | 5 | <100k | 1585 | 36 | 2003
Love | 3 | 2M | 5690 | 92 | 2003
Mind | 2 | 1/4M | 7687 | 239 | 2010
Night | 4 | >200k | 1384 | 14 | 2017
Performing arts | 2 | 1/4M | 3896 | 23 | 2015
Antioch | 4 | 1/2M | 5945 | 109 | 2005
Library of Congress | 4 | 1/2M | 5696 | 132 | 2015

—Femke 🐦 (talk) 20:08, 14 April 2024 (UTC), 06:18, 15 April 2024 (UTC), 19:40, 17 April 2024 (UTC)[reply]

Interesting cohort of articles. Cas Liber (talk · contribs) 05:24, 16 April 2024 (UTC)[reply]
Indeed. Always a good sign for coreness when these single-word articles (Love, Mind, Night etc.) are present. Aza24 (talk) 03:55, 17 April 2024 (UTC)[reply]
FYI, some of the pageview statistics in the table look a little inaccurate. ~~ AirshipJungleman29 (talk) 22:56, 17 April 2024 (UTC)[reply]
I had forgotten to update the year for which it takes the pageviews (2022 instead of 2023); taking the last 12 months would be more effort. I've updated the table with 2023 data now, and quite a few articles changed categories. Of course, the Million Award should be taken with a grain of salt, as it's based on the last 12 months if I recall correctly.
I think this link should lead you to the code. Not sure what people need to have access, though. —Femke 🐦 (talk) 16:08, 18 April 2024 (UTC)[reply]
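For anyone curious how totals like these can be generated: below is a minimal illustrative sketch (not the linked script), assuming the public Wikimedia Pageviews REST API; the function name, User-Agent string and example title are placeholders.

import requests

# Sketch: sum a calendar year of monthly pageviews for one article via the
# Wikimedia Pageviews REST API (per-article endpoint). Titles with special
# characters would additionally need URL-encoding.
API = ("https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
       "en.wikipedia/all-access/user/{title}/monthly/{start}/{end}")

def yearly_views(title, year=2023):
    url = API.format(title=title.replace(" ", "_"),
                     start=f"{year}0101", end=f"{year}1231")
    resp = requests.get(url, headers={"User-Agent": "core-contest-stats-example"})
    resp.raise_for_status()
    return sum(item["views"] for item in resp.json()["items"])

print(yearly_views("Renewable energy"))  # yearly total, used to bin articles into the table's view categories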

Requesting to withdraw

Putting this here because I'm not sure how to "withdraw" from this. Before the contest started, I signed up with Call of Duty (VIT 5), an article I thought I would be able to improve in an adequate time frame. However, a third of the way into the contest, I do not think I will be able to properly improve the article, or at least do it justice. I will also be leaving the country for the last week and a half of the contest, per a note I have left on my own talk page, which will limit my time. While I may improve this article one day on my own terms, I am requesting to withdraw from this year's Core Contest. If this is not the right place to request a withdrawal, please let me know where that would be. λ NegativeMP1 06:15, 3 May 2024 (UTC)[reply]

Hey @NegativeMP1, no worries! Thanks for your initial interest and honest evaluation of your time. Anyone is allowed to withdraw at any time; we just ask that you move your entry in Wikipedia:The Core Contest/Entries to Wikipedia:The Core Contest/Entries#Withdrawn entries. Best – Aza24 (talk) 18:46, 3 May 2024 (UTC)[reply]

Payment issue

So, I still have not been paid for last year's contest. I needed to register a new bank account that supported international payments, submitted the information to Karla Marte at WMUK at the beginning of May, and sent a reminder email earlier this week after hearing nothing back. She hasn't gotten back to me since. Did anyone else manage to get paid? (t · c) buidhe 14:43, 24 May 2024 (UTC)[reply]

Yes, I did, a week or two ago, but I'm in the UK. It's not good. Mind you, I've had similar issues with US universities and small payments for research interviews, etc. For the amounts involved, you'd think they could just mail banknotes. Johnbod (talk) 15:44, 24 May 2024 (UTC)[reply]
Similar problem for me. I sent them my banking information at the end of April and there hasn't been a transaction so far. Phlsph7 (talk) 15:49, 24 May 2024 (UTC)[reply]
Sorry to hear, but thank you all for reaching out. I'll follow up with WMUK to see if there has been a delay on their end, or if there's another reason. Aza24 (talk) 20:04, 24 May 2024 (UTC)[reply]
I gave up once it was clear that I had to divulge my personal information. There's a price where I'd trust the WMF with that, but it's higher than £35. Thebiguglyalien (talk) 02:08, 25 May 2024 (UTC)[reply]
Here in the UK I just got Amazon vouchers (fine by me), so only my email was needed. But these can't be sent/used internationally, by what seems to me an odd Amazon quirk. It's remarkably difficult/expensive to make small international payments of actual money - odd when debit & credit cards are so efficient across borders. Johnbod (talk) 02:20, 25 May 2024 (UTC)[reply]
I agree, having to disclose personal information is an issue. Given all the other issues encountered so far, it would probably be best to not offer monetary rewards in the future and use barnstars instead. While the idea is nice in principle, the current process is too cumbersome to be worthwhile. Phlsph7 (talk) 06:45, 25 May 2024 (UTC)[reply]
It wasn't a problem in previous years, afaik. Johnbod (talk) 03:41, 26 May 2024 (UTC)[reply]
I don't think it was either. We are currently waiting to hear back; if these issues persist, we'll look for alternative funding in the future. Aza24 (talk) 16:25, 26 May 2024 (UTC)[reply]
{{u|Aza24}} Thanks for reaching out; did you hear anything back? (t · c) buidhe 19:37, 9 June 2024 (UTC)[reply]
We got an email this Tuesday saying they'd hope to follow up within the week. Apparently, the finance department is needed for a transfer like this... Again, very sorry for the delay! —Femke 🐦 (talk) 19:53, 9 June 2024 (UTC)[reply]
Unfortunately, they have not followed up. I have sent another email. (t · c) buidhe 02:27, 17 June 2024 (UTC)[reply]
I have also just claimed mine but I'm UK based so similarly was able to get an Amazon voucher without providing any personal details. Sammielh (talk) 11:10, 25 May 2024 (UTC)[reply]

Words and references added table

Article | VIT | 2023 views | Words | References | Words added | Refs added
Classical Chinese | 4 | >100k | 2050 | 20 | -153 | -2
Monarchy | 3 | >1/2M | 4942 | 52 | 6 | 0
Turkey | 3 | >8M | 11531 | 648 | -2041 | -37
Wang Xizhi | 4 | <100k | 3850 | 145 | 3119 | 138
Renewable energy | 3 | >M | 6699 | 318 | -1027 | -37
Federal Republic of Central America | 5 | >200k | 9244 | 387 | 6574 | 291
Edward Oliver LeBlanc | 5 | <100k | 2931 | 119 | 2679 | 114
Independent music | 5 | 150k | 1784 | 93 | -66 | 67
Pakistan | 3 | 7M | 10800 | 661 | -2181 | 12
Voltairine de Cleyre | 5 | <100k | 12864 | 419 | 9692 | 333
Love | 3 | 2M | 7087 | 112 | 1358 | 19
Mind | 2 | 1/4M | 7535 | 175 | -152 | -64
Night | 4 | >200k | 4184 | 212 | 2800 | 198
Performing arts | 2 | 1/4M | 3896 | 23 | 0 | 0
Human history | 1 | 1/2M | 9897 | 581 | -385* | -1*
Withdrawn entries
Library of Congress | 4 | 1/2M | 5696 | 132 | 0 | 0
Mongol invasions and conquests | 4 | 1/4M | 1502 | 48 | 0 | 0
Antioch | 4 | 1/2M | 5954 | 109 | 9 | 0
Nursing | 3 | >1/2M | 3618 | 141 | -5877 | -26
History of North America | 3 | <100k | 3890 | 31 | -14 | 0
Modern philosophy | 4 | >100k | 1796 | 21 | 0 | 0
Moscow | 3 | >M | 15437 | 316 | -109 | 7
Call of Duty | 5 | 3M | 4916 | 100 | -14 | 0

* Manually estimated, so maybe not consistent.
Thanks everybody for your contributions! As always, there's a huge improvement in many of the articles here. There are a lot of entries this year, so we might take slightly longer than normal to get back to y'all. —Femke 🐦 (talk) 07:42, 1 June 2024 (UTC)[reply]

Thanks for the helpful statistics. For the number of references, I think you are counting reference tags; at least I get the same count as you (175) for the article Mind. This article uses citation bundles, meaning that each reference tag may contain several individual citations, by my count 441 citations in total. This will probably also be an issue for the article Human history, which has 581 bundles with a total of 658 individual citations by my count. Phlsph7 (talk) 08:20, 1 June 2024 (UTC)[reply]
That's good to know! I'm very much looking forward to reading both! —Femke 🐦 (talk) 19:08, 1 June 2024 (UTC)[reply]
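As a side note for anyone reproducing these counts, the sketch below illustrates the ref-tag vs. bundled-citation distinction Phlsph7 describes; the citation-template names it looks for ({{cite ...}}, {{sfn}}, {{harvnb}}) are only assumptions, and real articles cite in many other ways.

import re

def citation_counts(wikitext):
    # Bodies of non-self-closing <ref>...</ref> tags.
    ref_tags = re.findall(r"<ref[^/>]*>(.*?)</ref>", wikitext, flags=re.DOTALL | re.IGNORECASE)
    # Count citation templates inside each tag; a plain-text ref counts as one.
    bundled = sum(
        max(len(re.findall(r"\{\{\s*(?:cite|sfn|harvnb)", body, flags=re.IGNORECASE)), 1)
        for body in ref_tags)
    return len(ref_tags), bundled

sample = 'Text.<ref>{{cite book |title=A}}{{cite journal |title=B}}</ref> More.<ref>{{cite web |title=C}}</ref>'
print(citation_counts(sample))  # (2, 3): two ref tags, three individual citations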

Readability table

More data. I absolutely love ChatGPT, which wrote most of the code to get this. I tried to do it myself last year but gave up after two hours. Now it only took me 30 minutes :). It's the Flesch reading ease score, which usually gives a good rough indication of how difficult text is, based on sentence length and word length. Of course, elements like paragraph length, section length, logic and structure all impact readability and understandability too, but these scores align quite well with my "bonus points" for clarity. Next year, I hope to track changes in readability over the contest.

My own experience with reading scores for technical articles is that you can reach >45 without having to compromise on content. A score between 60 and 70 is considered good for a general audience, but I've never managed to reach that, even after rewriting specifically for readability. —Femke 🐦 (talk) 17:52, 9 June 2024 (UTC)[reply]
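As an aside, the score itself is easy to reproduce: the sketch below computes it from the standard formula 206.835 − 1.015 × (words/sentences) − 84.6 × (syllables/words), with syllables approximated by counting vowel groups, so results will only roughly match dedicated tools.

import re

def count_syllables(word):
    # Rough heuristic: count vowel groups, with a small silent-e correction.
    groups = re.findall(r"[aeiouy]+", word.lower())
    count = len(groups)
    if word.lower().endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def flesch_reading_ease(text):
    # Flesch reading ease: higher scores mean easier text.
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    words = re.findall(r"[A-Za-z]+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syllables / len(words))

print(round(flesch_reading_ease("Night is the period of darkness between sunset and sunrise.")))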

Title | Readability score | Explanation
Mongol invasions and conquests | 44 | Difficult. Best understood by those with a university-level education.
History of North America | 33 | Difficult. Best understood by those with a university-level education.
Classical Chinese | 24 | Very difficult. Suitable for a very advanced level, such as postgraduate students.
Modern philosophy | 19 | Very difficult. Suitable for a very advanced level, such as postgraduate students.
Nursing | 34 | Difficult. Best understood by those with a university-level education.
Monarchy | 29 | Very difficult. Suitable for a very advanced level, such as postgraduate students.
Turkey | 36 | Difficult. Best understood by those with a university-level education.
Wang Xizhi | 41 | Difficult. Best understood by those with a university-level education.
Renewable energy | 31 | Difficult. Best understood by those with a university-level education.
Federal Republic of Central America | 19 | Very difficult. Suitable for a very advanced level, such as postgraduate students.
Moscow | 41 | Difficult. Best understood by those with a university-level education.
Edward Oliver LeBlanc | 38 | Difficult. Best understood by those with a university-level education.
Independent music | 41 | Difficult. Best understood by those with a university-level education.
Call of Duty | 49 | Difficult. Best understood by those with a university-level education.
Pakistan | 25 | Very difficult. Suitable for a very advanced level, such as postgraduate students.
Voltairine de Cleyre | 38 | Difficult. Best understood by those with a university-level education.
Love | 35 | Difficult. Best understood by those with a university-level education.
Mind | 24 | Very difficult. Suitable for a very advanced level, such as postgraduate students.
Night | 54 | Fairly difficult. Requires reading comprehension of a higher secondary school level.
Performing arts | 38 | Difficult. Best understood by those with a university-level education.
Antioch | 42 | Difficult. Best understood by those with a university-level education.
Library of Congress | 34 | Difficult. Best understood by those with a university-level education.
Human history | 30 | Difficult. Best understood by those with a university-level education.
Btw, I remember doing a version of a reading test for the lead of Turkey. I think when you introduce foreign names (such as "Hattians"), it might change the reading comprehension score. But sometimes foreign names are necessary. Bogazicili (talk) 07:21, 10 June 2024 (UTC)[reply]
I introduced the topic of the Flesch reading ease score a while back at the good article and featured article talk pages. There was strong backlash against using it to assess the prose quality of Wikipedia articles, see here and here. So making this score a central part of the core contest assessment process would be a controversial move. Phlsph7 (talk) 07:36, 10 June 2024 (UTC)[reply]
Although most of my articles, including last year's entry, are not very technical and probably score OK by this metric, I don't agree that it's a good evaluation of prose quality. Besides the issues discussed above, it favors people who work on easier and less technical topics, since it's harder to get a good readability score for more technical articles. (t · c) buidhe 13:43, 10 June 2024 (UTC)[reply]
The score definitely has limitations. In the past, I've seen sentence-by-sentence rewrites with these tools, with different levels of success. Often, the flow gets lost as sentences all become of similar length.
I find it a very useful metric on an article level though. If you write about modern philosophy, a WP:ONEDOWN audience could be final-year secondary school students / early uni students. On the other hand, an article like Turkey may be of interest to 15-year-olds, some of whom would have English as a second language. A simple score can give you an idea of whether you're roughly on the right track for your target audience.
Usually, articles that do a great job explaining stuff score highly, and I want to make one metric of this visible in the table. Similarly, when we give points to sourcing improvements in an article, the number of newly added sources is a very rough metric. A lot of people improve articles by removing bad sources instead. —Femke 🐦 (talk) 18:50, 10 June 2024 (UTC)[reply]
To the extent that it is possible, the assessment of article quality should follow community consensus. For whatever good or bad reasons, there seems to be wide consensus (in the discussions linked above) against using the Flesch reading ease score to assess prose quality. I'm not aware of a similar consensus regarding the other metrics you use.
If you are looking for more metrics as rough guidelines to help with the assessment, I think there are less controversial alternatives that more closely reflect Wikipedia policies and guidelines. One candidate would be the existence of maintenance tags in the article, like "Multiple issues", "Original research", and "More citations needed". Another approach is to look for paragraphs that require references but don't have them. This is the approach used at WP:SWEEPS2023. You could use this script for both metrics, with the caveat that the count of unreferenced paragraphs has to be manually confirmed, since the result is not accurate for all articles. Phlsph7 (talk) 11:57, 11 June 2024 (UTC)[reply]
The work for the core contest is usually of such high quality (at least the entries vying for the top positions) that the WP:Good article criteria provide more of a guide to how I judge than things like maintenance tags. Criterion 1a of the GA criteria links to Wikipedia:Make technical articles understandable (MTAU), which has guidance on readability. I use five categories for scoring points (we all have a different system). One of the categories is accessibility & MTAU, so this includes things like alts, {{lang}} templates and such, as well as writing understandable text.
One of the reasons I wanted to get some rough objective metric in this area is to avoid my own biases. I find it much easier to read technical articles in science than in history or literature. This gives me at least some idea of whether I might need to take a deeper (human!) look at an article. —Femke 🐦 (talk) 19:53, 11 June 2024 (UTC)[reply]
I'd much prefer human judges' biases than the biases that Flesch-Kincaid brings. Given the diversity of backgrounds that the judges have, their leanings should roughly cancel out. Humans can evaluate understandability far more holistically than just counting syllables and sentence lengths. Humans are able to assess: Does the article define its terms? Does it use terms consistently? Does it give terms the same meaning that ordinary people give them? Does it have a logical flow?
Just because something creates numbers doesn't make it objective. When a system gives issues like the above zero weight because they're hard to measure, that's a form of bias. Clayoquot (talk | contribs) 21:57, 11 June 2024 (UTC)[reply]
Article | Unreferenced paragraphs | Maintenance tags
Classical Chinese | 10 | 1x More citations needed
Edward Oliver LeBlanc | 0 | (none)
Federal Republic of Central America | 0 | (none)
Human history | 0 | (none)
Independent music | 0 | (none)
Library of Congress | 13 | 1x citation needed
Love | 12 | 1x Unreferenced section, 1x Empty section, 7x clarification needed, 3x specify, 1x how?, 3x citation needed, 3x relevant?, 3x page needed
Mind | 0 | (none)
Monarchy | 38 | 2x Multiple issues, 1x Confusing, 2x More citations needed, 2x Original research, 1x Unreferenced section, 1x citation needed
Night | 0 | (none)
Pakistan | 2 | 2x dead link
Performing arts | 36 | 1x More citations needed, 3x Expand section, 10x citation needed, 1x page needed
Renewable energy | 1 | 1x contradictory, 1x permanent dead link
Turkey | 2 | 1x citation needed
Voltairine de Cleyre | 0 | (none)
Wang Xizhi | 0 | (none)
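For reference, here is a rough sketch (not the script linked above) of how the two metrics in the table could be approximated from an article's wikitext; the template list and the end-of-paragraph heuristic are simplifying assumptions, which is why, as Phlsph7 notes, the unreferenced-paragraph count needs manual confirmation.

import re

# Assumed, incomplete list of maintenance templates to look for.
MAINTENANCE = ["citation needed", "more citations needed", "original research",
               "unreferenced section", "multiple issues"]

def rough_metrics(wikitext):
    # A paragraph is flagged as unreferenced if it does not end with </ref>;
    # headings are skipped.
    paragraphs = [p.strip() for p in wikitext.split("\n\n")
                  if p.strip() and not p.strip().startswith("=")]
    unreferenced = sum(1 for p in paragraphs if not re.search(r"</ref>\s*$", p))
    tags = {name: len(re.findall(r"\{\{\s*" + re.escape(name), wikitext, flags=re.IGNORECASE))
            for name in MAINTENANCE}
    return unreferenced, {name: n for name, n in tags.items() if n}

sample = "== History ==\n\nAn unreferenced paragraph.\n\nA referenced one.<ref>{{cite web}}</ref>\n\n{{Citation needed|date=May 2024}}"
print(rough_metrics(sample))  # (2, {'citation needed': 1})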

2024 Winners!

Hi all, the 2024 winners have been announced.[1] Included below for convenience's sake:

Herewith is Aza24 (talk · contribs)'s announcement for the April/May 2024 Core Contest:
  • First place (and a prize of £100) goes to Rjjiii (talk · contribs) for a complete rewrite of Night. Tackling such broad articles can be challenging, but Rjjiii approached this head on, with a vast reworking of sourcing and prose. In particular, we commend its readability and global perspective.
  • Second place (and a prize of £80) goes to Phlsph7 (talk · contribs) for improving both Mind and Human history. The former received a complete rewrite with monumental reworking of both sources and coverage. The latter gained a series of crucial sourcing and prose improvements, which have pushed this Vital level-1 article to GA standard.
  • A tie for third place (and a prize of £35) goes to DanCherek (talk · contribs) for improving Wang Xizhi. What was once a C-class article is now among the most outstanding Chinese biographies on Wikipedia.
  • A tie for third place (and a prize of £35) goes to SheriffIsInTown for improving Pakistan. The Vital level-3 article has seen crucial sourcing advances, major updating and reorganization, as well as a heavy trim of extraneous information.

The panel of judges was Femke (talk · contribs), Aza24 (talk · contribs) and Casliber (talk · contribs).

Congratulations to all the winners, and thank you to all of the participants! WMUK will reach out shortly. – Aza24 (talk) 23:08, 14 June 2024 (UTC)[reply]

Oh wow, many thanks! I'm checking other entries out now, and a lot of them are major improvements. Rjjiii (talk) 16:34, 15 June 2024 (UTC)[reply]
I think I said this before but anyway, am pleased this hasn't suffered founder's syndrome, as Femke and Aza24 have done the heavy lifting the past few iterations. Very happy to see this live on. Reading the articles is the reward. Cas Liber (talk · contribs) 02:41, 16 June 2024 (UTC)[reply]