Asteroid Zoo Talk

How about giving us scores?

  • Devilstower

    I was showing off the site to a friend's children, both of whom have spent hours on similar projects, and the first thing that struck them was that there was no "score," no feedback on how much time or effort they'd put in.

    While it might not be possible to say "you've discovered XX asteroids," because those finds haven't been validated, you could certainly say something like "you've seen XXX image sets" and/or "you've spotted XX known asteroids, which is XX% of those in the images you've looked at." Either would give users a target.

  • planetaryscience

    Well, you can view how many classifications you've made at https://www.zooniverse.org/projects/current, but you have to have an account.

  • CTidwell3

    The closest thing to it is clicking on the white circle in the upper right corner next to your name. You will need to log in to the Zooniverse site, but on that screen it will tell you how many submissions you have made to many different Zooniverse projects, including the count for Asteroid Zoo.

    I agree that, at a minimum, this count should also show up somewhere on the Asteroid Zoo site, since it is not obvious that you can get a count by clicking on the white circle.

  • Devilstower

    As it happens, I design dashboards and user interfaces for a living. One thing I can tell you is that feedback makes all the difference. Want someone to get better at a job? Give them the feedback they need as quickly as possible so they can learn and improve. Want someone to be enthusiastic about a job? Give them targets to shoot for and a sense of how they are performing relative to others.

    Unfortunately, the current interface provides little of either.

  • Dr.Asteroid (scientist, admin)

    Thank you for the feedback. We're trying to figure out how to give better feedback, but since we have to validate candidates both by the number of people who click on a particular asteroid and then by submission through the Minor Planet Center, that will take some time; it has to be done correctly.

    Do you have a suggestion for something else that would be meaningful?

  • planetaryscience

    Well, the 'you're the first to view this set of images' note is certainly helpful. It makes people put more effort into an image, since they would be the first to spot any asteroids in it.

    This may not be exactly what others are looking for, but a list of the top classifiers of the month might be fun. However, it might push people toward quicker, less careful examination of individual frames.

  • Chipuk

    This is an excellent thread with some good ideas. To add a few more: how about an accuracy ranking for each user; a star/class system, so that each person moves up a class after analysing X images or identifying X artefacts; and, finally, a database recording what other users have identified on a given set of results, which would let you see how you are doing.

  • djsato (admin), in response to Chipuk's comment

    Based on past projects, we've found that introducing gaming elements (such as scores, badges, etc.) requires a certain amount of care, since they can shift users' motivation from the science itself to merely attaining high scores or collecting badges, regardless of the science. While there's plenty of literature and case studies showing that adding game-like characteristics to tasks increases user engagement, Zooniverse projects face the caveat that if a game is simply not entertaining enough, the science suffers as well. In short, making successful games is not easy, and we'd rather present the science as-is. This topic of "gamification" has come up frequently within Zooniverse, and while we're not opposed to it, we find that the direct approach of motivating users with doing REAL science has worked best.

    Full disclosure: I’m a web developer at Zooniverse.

  • Devilstower

    I'd say that, in this project, you have two leading success indicators -- the number of image sets viewed, and the number of candidates nominated -- and one key performance indicator, the percentage of known asteroids identified. Certainly, there are dangers in incentivizing behaviors through "gamification." Believe me, it's a very big issue in an industrial setting, where scoring designed to promote productivity can lead to unsafe behavior. But in this case, I think if you stick to these points, you'll be essentially in the clear.

    Use percentage of known asteroids identified as the primary factor, with the number of image sets viewed as a contributing factor. Punt the number of candidates identified to the side.

    So you end up with a value that's like "45% of known identified / 1000 image sets viewed --- 42 candidate objects identified."

    You could use that as feedback to individual users, or put it on a scoreboard for everyone to see, without promoting sloppy work or interfering with the process. If marking artifacts is also something you want to promote, you could add it in a way similar to the identification of candidates.
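    To make the proposal concrete, here is a minimal sketch of how that feedback line could be computed. The `UserStats` structure and its field names are hypothetical, purely for illustration; Asteroid Zoo's actual data model is not public in this thread.

    ```python
    # Sketch of the proposed feedback string: primary factor first
    # (% of known asteroids found), then volume (image sets viewed),
    # then candidate count as a side note. All names are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class UserStats:
        known_in_viewed_sets: int   # known asteroids present in the sets this user viewed
        known_identified: int       # how many of those the user actually marked
        image_sets_viewed: int
        candidates_identified: int

    def feedback_line(stats: UserStats) -> str:
        """Build the one-line score summary suggested above."""
        if stats.known_in_viewed_sets:
            pct = round(100 * stats.known_identified / stats.known_in_viewed_sets)
        else:
            pct = 0  # avoid dividing by zero for brand-new users
        return (f"{pct}% of known identified / "
                f"{stats.image_sets_viewed} image sets viewed --- "
                f"{stats.candidates_identified} candidate objects identified")

    print(feedback_line(UserStats(100, 45, 1000, 42)))
    # 45% of known identified / 1000 image sets viewed --- 42 candidate objects identified
    ```

    Keeping the candidate count out of the headline percentage is deliberate: it reports activity without rewarding indiscriminate marking.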

  • nicro46

    I've viewed over a thousand frames and I'm still learning, but today, in 20 minutes of work, I noticed that a set of about 20 images was repeated several times in succession, with an obvious loss of time. What is the reason?

  • CTidwell3

    It's being looked into:

    http://talk.asteroidzoo.org/#/boards/BAZ0000002/discussions/DAZ0000349

    http://talk.asteroidzoo.org/#/boards/BAZ0000003/discussions/DAZ000032g
