Reliable Intelligence Identification on Vietnamese SNSs (ReINTEL) - 2020

Organized by reml

Current phase: Private Test - Legacy (started Nov. 28, 2020, midnight UTC)

Competition ends: never

Our website

Important dates

  • Sep 10, 2020: Registration open
  • Oct 15, 2020: Registration closed
  • Oct 21, 2020: Challenge started (via CodaLab.Org)
  • Nov 15, 2020: Team merger deadline
  • Nov 30, 2020: Final results on private test
  • Dec 01, 2020: Announce top 3 teams to submit technical reports
  • Dec 10, 2020: Deadline for top 3 teams to submit technical reports
  • Dec 12, 2020: If any top team does not submit its report, follow-up teams can submit and take its place (follow-up teams are advised to write their reports in advance and submit by this deadline)
  • Dec 15, 2020: Final winners announcement
  • Dec 18, 2020: Result presentation and award ceremony (workshop day)

This challenge aims to identify whether a piece of information shared on social network sites (SNSs) is reliable or unreliable. With the rapid growth of SNSs, e.g., Facebook, Zalo, or Lotus, there are approximately 65 million Vietnamese users online, with annual growth of 2.7 million users in the past year, as reported by Digital 2020 [6]. SNSs have become essential means for users not only to connect with friends but also to freely create and share diverse information [2, 5], e.g., news. With this freedom, some users spread unreliable information for personal purposes, harming the online community. Detecting whether news spreading on SNSs is reliable or unreliable has gained significant attention recently [1, 3, 4]. This shared task therefore targets identifying reliable shared information on Vietnamese SNSs. It offers participants who are interested in the problem an opportunity to contribute their knowledge toward improving the online community for social good.

References

[1] Ruchansky, N., Seo, S., & Liu, Y. (2017, November). Csi: A hybrid deep model for fake news detection. In Proceedings of the 2017 ACM on Conference on Information and Knowledge Management (pp. 797-806).

[2] Shu, K., Sliva, A., Wang, S., Tang, J., & Liu, H. (2017). Fake news detection on social media: A data mining perspective. ACM SIGKDD explorations newsletter, 19(1), 22-36.

[3] Shu, K., Cui, L., Wang, S., Lee, D., & Liu, H. (2019, July). dEFEND: Explainable fake news detection. In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining (pp. 395-405).

[4] Shu, K., Wang, S., & Liu, H. (2019, January). Beyond news contents: The role of social context for fake news detection. In Proceedings of the Twelfth ACM International Conference on Web Search and Data Mining (pp. 312-320).

[5] Zhou, X., Zafarani, R., Shu, K., & Liu, H. (2019, January). Fake news: Fundamental theories, detection strategies and challenges. In Proceedings of the Twelfth ACM International Conference on Web Search and Data Mining (pp. 836-837).

[6] Digital 2020 - Global Digital Overview, https://wearesocial.com/digital-2020

Result submission 

Participants must submit a .zip file containing a results.csv file. The results.csv file must list the results in the same order as the test set, one line per item, in the following format. Do not include a header row in the csv file:

  • id1, label probability 1

  • id2, label probability 2
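The packaging described above can be sketched as follows. The `results.csv` and .zip requirements come from the task description; the ids, probabilities, and the archive name `submission.zip` are illustrative placeholders for your own model outputs.

```python
import csv
import zipfile

# Hypothetical predictions: (id, probability of the label), in the same
# order as the test set. Replace with your model's actual outputs.
predictions = [("id1", 0.92), ("id2", 0.13)]

# Write results.csv with no header row, as the submission format requires.
with open("results.csv", "w", newline="") as f:
    writer = csv.writer(f)
    for item_id, prob in predictions:
        writer.writerow([item_id, prob])

# Package the csv into a .zip archive for upload.
with zipfile.ZipFile("submission.zip", "w") as zf:
    zf.write("results.csv")
```

Keeping the rows in test-set order matters: the scorer matches predictions to ground truth by position, not by id.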

Evaluation Metric

Submissions will be evaluated against the ground-truth labels using the ROC-AUC metric.
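For local validation, ROC-AUC can be computed with scikit-learn. This is a sketch, not the organizers' official scorer; the labels and scores below are illustrative, with 1 assumed to denote the positive class.

```python
from sklearn.metrics import roc_auc_score

# Illustrative ground-truth labels and predicted label probabilities.
y_true = [0, 0, 1, 1]
y_score = [0.1, 0.4, 0.35, 0.8]

# ROC-AUC: probability that a randomly chosen positive example is
# ranked above a randomly chosen negative one.
auc = roc_auc_score(y_true, y_score)
print(auc)  # 0.75
```

Here 3 of the 4 positive/negative pairs are ranked correctly (0.35 > 0.4 is the one inversion), giving 3/4 = 0.75.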

General rules

  • Right to cancel, modify, or disqualify. The Competition Organizer reserves the right at its sole discretion to terminate, modify, or suspend the competition.

  • By submitting results to this competition, you consent to the public release of your scores at the Competition workshop and in the associated proceedings, at the task organizers' discretion. Scores may include but are not limited to, automatic and manual quantitative judgments, qualitative judgments, and such other metrics as the task organizers see fit. You accept that the ultimate decision of metric choice and score value is that of the task organizers.

  • By joining the competition, you affirm and acknowledge that you agree to comply with applicable laws and regulations, that you will not infringe upon any copyrights, intellectual property, or patents of another party in the software you develop in the course of the competition, and that you will not breach any applicable laws and regulations related to export control and to data privacy and protection.

  • Prizes are subject to the Competition Organizer’s review and verification of the entrant’s eligibility and compliance with these rules as well as the compliance of the winning submissions with the submission requirements.

  • Participants grant the Competition Organizer the right to use their winning submissions, and the source code and data created for and used to generate those submissions, for any purpose whatsoever and without further approval.

Eligibility

  • Each participant must create a CodaLab account to submit their solution for the competition. Only one account per user is allowed.

  • The competition is public, but the Competition Organizer may elect to disallow participation according to its own considerations.

  • The Competition Organizer reserves the right to disqualify any entrant from the competition if, in the Competition Organizer’s sole discretion, it reasonably believes that the entrant has attempted to undermine the legitimate operation of the competition through cheating, deception, or other unfair playing practices.

Team

  • Participants are allowed to form teams. A team may have at most 5 participants.

  • You may not participate in more than one team. Each team member must be a single individual operating a separate CodaLab account. 

  • Team mergers are allowed and can be performed by the team leader. Team merger requests will not be permitted after the "Team merger deadline".  

  • In order to merge, the combined team must have a total submission count less than or equal to the maximum allowed for a single team as of the merge date. The maximum allowed is the number of submissions per day per phase multiplied by the number of days the competition has been running. 

  • The organizers don’t provide any assistance regarding team mergers.
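The merger budget rule above can be worked through numerically. This is a sketch of the stated formula only; the per-day limit is taken from the Submission section, while the day count and team submission totals are hypothetical.

```python
# Maximum allowed for a single team = submissions/day/phase * days running.
per_day_limit = 10          # e.g., the Public Test phase allows 10/day/team
days_running = 7            # hypothetical number of days the phase has run

max_allowed = per_day_limit * days_running

# A merger is permitted only if the combined submission count fits the cap.
team_a_submissions = 30     # hypothetical
team_b_submissions = 25     # hypothetical
can_merge = (team_a_submissions + team_b_submissions) <= max_allowed
print(max_allowed, can_merge)  # 70 True
```

In this example the merged team has used 55 of a 70-submission budget, so the merger would be allowed; at 75 combined submissions it would not.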

 

Submission

  • Maximum number of submissions in each phase:

    • Phase 1 - Warm Up: 10 submissions / day / team
    • Phase 2 - Public Test: 10 submissions / day / team
    • Phase 3 - Private Test: 5 submissions / day / team
  • Submissions are void if they are in whole or part illegible, incomplete, damaged, altered, counterfeit, obtained through fraudulent means, or late. The Competition Organizer reserves the right, in its sole discretion, to disqualify any entrant who makes a submission that does not adhere to all requirements.

Data

By downloading or by accessing the data provided by the Competition Organizer in any manner you agree to the following terms:

  • You will not distribute the data except for non-commercial, academic-research purposes.

  • You will not distribute, copy, reproduce, disclose, assign, sublicense, embed, host, transfer, sell, trade, or resell any portion of the data provided by the Competition Organizer to any third party for any purpose.

  • The data must not be used for providing surveillance, analyses or research that isolates a group of individuals or any single individual for any unlawful or discriminatory purpose.

  • You accept full responsibility for your use of the data and shall defend and indemnify the Competition Organizer, against any and all claims arising from your use of the data.


Final Leaderboard (Private Test)

  # Username      Score
  1 SamsonPh      0.94
  2 khoibui_noti  0.87
  3 khaidoan25    0.79