Google Code In 2017
Revision as of 01:18, 27 October 2017
Central page for administering OSGeo participation in Google Code In 2017
- The Google Code In program has been officially announced
- Google Code In main page
- How Google Code-in Works
- Examples of Google Code-in Tasks
- Logo, flyer and presentation
- Roles and responsibilities of Students, Mentors and Admins
- Full timeline
- Oct 9 2017: organizations can start drafting application to be mentoring organizations
- Oct 24 2017: application deadline (exact time) for mentoring organizations (we should have mentors and tasks by then - "between 150 to 500 tasks")
- Oct 26 2017: Mentoring organizations announced
- Nov 28 2017: Contest opens for entries by student participants
- Jan 15 2018: Deadline for students to claim new tasks
- Jan 17 2018: All student work must be submitted; contest ends
- Jan 18 2018: Mentoring organizations complete all evaluations of students’ work
- Jan 31 2018: Grand Prize Winners and Finalists announced
- June (exact dates TBD): Grand Prize Winner’s Trip
- IRC-channel: #osgeo-gsoc channel on freenode.net
- how to connect: choose your favorite IRC client, or connect directly in your browser via webchat
- Margherita Di Leo (IRC nickname: "madi")
- Jeff McKenna (IRC nickname: "jmckenna")
- Florin-Daniel Cioloboc
- Helmut Kudrnovsky
Interested mentors / volunteers
- fill out the Google form mentioned at Google Code In 2017 Mentors
- will be presented when the contest starts
- our OSGeo application as a mentor organization, see: Google Code In 2017 Application
Kickoff (virtual) meeting
- 2017-09-25 at 20:00 UTC
- On IRC in the #osgeo-gsoc channel on irc.freenode.net (connect directly in your browser via webchat)
- discuss ideas for tasks for the students
- plan to share mentor invites to all OSGeo projects
- Collect questions to discuss with other mentor orgs at mentor summit
Adrien Destugues (PulkoMandy), a mentor for Haiku, has replied to some of our questions:
Q: It is not clear to me whether it's possible to use more than one bug tracker, or whether we must use the OSGeo tracker for all the projects.
A: You don't need a single bug tracker. You will have to import your tasks into the dedicated GCI website, and students work from there. The goal is that they learn how to find their way in your project and GCI in general. In the case of Haiku, some tasks involve the OS itself (with a dedicated bug tracker), some involve contributing to existing projects (on GitHub, for example), and some tasks are about writing an application from scratch, so there isn't even a bug tracker to start with.
Q: The "how it works" page doesn't mention how word of this gets out to students. Does Google handle that? How is promotion to students handled?
A: Google handles some of it, students from past years handle some of it, and you can have a task like "make a 20 minute talk about our project and GCI for other students around you" so the students help you spread the word. Of course you can also do your own communication.
Q: Could we create "virtual environments" for the students to play with, so they can eventually come up with a patch? For example, for tasks that would otherwise require access to sensitive systems (e.g. SAC, web pages, etc.).
A: About setting up the environment for building the project: if that part of your project is really complex, providing a preset environment might be a good idea. We went a different way: GCI has "beginner" tasks, which are things the students should do first before moving on to other tasks in the project. In our case, these were something like "get the source, compile it, and add your name to the about box" or even "get the Haiku operating system running, open a text document and put your name in it, then take a screenshot". This allows the students to get points for just getting started with these things. Our documentation tasks also included tasks to write or update pages about setting up the development environment.
Q: Is it necessary for the students to pick from different "types" of tasks, or can they decide to complete only one type?
A: Students can mix types of tasks as they wish. At Haiku we are unfortunately offering far more coding tasks than anything else, mostly because that's what we can mentor best.
Q: When is the deadline to prepare the tasks: Oct 9th (Open source organizations can apply to be mentoring organizations) or Nov 28th (Contest opens for entries by student participants)? What is the minimum number of tasks we need to get ready by the deadline?
A: The deadline for tasks is, AFAIK, just before the contest opens to students. In fact, you can continue adding new tasks to the contest after it opens, all the way until January, to keep the students busy. However, be prepared to have a few hundred tasks ready when the contest opens.
Q: How many mentors do other organizations have (to be on the safe side and not underestimate the work load)?
A: I don't remember the exact number of mentors, but I'd say about 10 to 12 is a minimum. Expect hundreds of students trying to complete tasks, and a lot of activity on your IRC channel and other communication media. It always feels strange to us when things return to their normal quiet state after GCI ends.

I have started to work on a tool to edit Haiku tasks. You may not find it of much use, as it runs only on the Haiku operating system. However, alongside it is a JSON file with our tasks from the 2016 edition. This can give you an idea of the number of tasks and their contents: https://github.com/pulkomandy/gcitool
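As a rough illustration, a task export like that can be inspected with a few lines of Python to see how many tasks an organization prepared and how they break down by type. The field names below (`name`, `categories`, `is_beginner`) are hypothetical stand-ins; the actual schema of the gcitool JSON may differ.

```python
import json
from collections import Counter

# Hypothetical sample in the shape a GCI task export might take;
# the real field names in the gcitool JSON may differ.
sample = """
[
  {"name": "Compile Haiku and add your name to the about box",
   "categories": ["Code"], "is_beginner": true},
  {"name": "Update the build documentation",
   "categories": ["Documentation"], "is_beginner": false},
  {"name": "Record a screencast of the installer",
   "categories": ["Outreach"], "is_beginner": false}
]
"""

tasks = json.loads(sample)

# Count tasks per category and how many are beginner-friendly.
by_category = Counter(c for t in tasks for c in t["categories"])
beginner = sum(1 for t in tasks if t["is_beginner"])

print(f"{len(tasks)} tasks, {beginner} beginner-friendly")
for category, count in by_category.most_common():
    print(f"  {category}: {count}")
```

Running a script like this over a real export would quickly show whether an organization is, as Haiku describes above, leaning heavily toward coding tasks.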
- Add yours
Insights from other orgs
Tim Abbott from the Zulip project says:
The Zulip project did GCI for the first time last year, and we learned basically everything we needed to know at the mentor summit GCI session. If you're spending time on preparation in advance of that, I'd encourage you to focus on a few things:
- Recruiting volunteers to help the students; we had over 30 people organized in 4-hour shifts to provide continuous coverage.
- Writing really good developer onboarding documentation, and testing it on inexperienced programmers.
- Creating a few really polished starter tasks (e.g. we had one that was basically "figure out how to do manual and automatic testing in Zulip, including writing a simple unit test").
- Planning out some larger categories of tasks (e.g. where you want 15 similar things done) that you can write really good guides for.