Tuesday, January 20, 2015

GOODREADS

Goodreads is probably my favorite website, and besides my email account and Facebook, I visit it more regularly than any other. Even more than I check this blog or my poetry site, or any of my favorite blogs (the Bloggess perhaps getting the most regular viewing from me). I constantly suggest to friends and acquaintances that they would profit by joining Goodreads, and I frequently post links to specific reviews. I daily hunt through the site in search of new books to pick up, and I have often been rewarded by discovering a new writer or work I probably never would have seen at the local library or bookshop. I seldom go longer than a day without posting a review, adding another to-read, or sending a note to a GR friend. Every week I look forward to reviews by certain friends, and I anticipate special entries, such as Karen's hilarious AIFAFs (as well as her often delicious reviews). I have met a few really nice people, though I keep my "friending" to a minimum, my friend count staying close to 10% of the number of books I have read. Although I don't care if other people do it, I don't see myself as a "friend" accumulator, and I don't understand how people would have time to read all the cross-posts that follow from having so many. I know that the site is as much about promoting sales, especially for Amazon, and about giving emerging writers a chance to market their books and find a readership, as it is about readers and reviews. One rule I follow is that I never friend a writer (although if I were ever asked to do so by a favored author, I might bend that rule). I like Listopia.

Still, there are many aspects of Goodreads I don't like so much, both in how some people abuse the system and in what seems to be laziness on the part of the administrators. For instance, I think people should be ranked only by the number of books they have actually read, and better yet, reviewed (or at least rated). Some people seemingly mark as "to-read" every book that swims across their path. One "user" was ranked for having marked more than 3,000 books as to-read in a single week. What does this tell me, other than that they have a ton of time on their hands and a rapid-fire clicking finger? By another measuring stick, members claim to have read certain numbers of books. One claimed 324 in a week. Really? I doubt it, even if they are a speed reader. Not surprisingly, they didn't post a single review. On the other hand, I do like it when people who actually review books (at least fifty characters) get ranked for having done so. How some people claim an average of more than ten books reviewed a day is a bit beyond me, however, and you would think the admins would check the veracity of these reports. Some people review short stories, of course. Likewise, there should be a different way to assess what is truly a "best" review, other than the number of people who "liked" it. Clearly there is nothing wrong with a review such as "wow. wow. WOW. I went into this book not expecting to like it. I'm not sure why, but this book completely surprised me!", but does anyone truly believe this was the best review of a book this week? Come on.

The system of liking is too permissive. Obviously people with large friend lists will accumulate the largest number of likes for their reviews, especially those reviewers who like to post cutesy videos and memes, and I guess there is no reason they shouldn't be allowed to do so, as even I find some of them hilarious. And I know I like it very much when anyone takes the time to "like" one of my reviews (although, strangely, there have been a few times when one of my reviews was "liked" before I had even posted it).

Whenever possible, when a book is posted on a list and you click on it, the link should go first to the hardback first edition, not to a paperback, an ebook, or any other variant or edition.

Trivia questions should be sent only to people who have marked that book as read.

I don't know if it is possible, but it sure would be cool if an algorithm could be developed that accurately compared reader compatibility. I think Goodreads already comes close. Wouldn't it be nice if, say, once two people had rated 100 or more of the same books with 90% or better similarity, the system sent each of them a note suggesting they check the other out?
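Here is a rough sketch, in Python, of the kind of check I am imagining. Everything in it is my own guesswork: the book-rating dictionaries, the 100-book and 90% thresholds, and the "within one star" notion of agreement are assumptions of mine, not anything Goodreads actually does.

# A rough sketch of a reader-compatibility check. The data shapes, thresholds,
# and the similarity measure itself are assumptions; Goodreads' real comparison
# feature no doubt works differently.

MIN_SHARED_BOOKS = 100   # only compare readers with a substantial overlap
MIN_SIMILARITY = 0.90    # 90% agreement before suggesting a match

def similarity(ratings_a, ratings_b):
    """Compare two readers' star ratings (dicts of book_id -> 1..5 stars).

    Returns (number of books both have rated, fraction of those books on
    which their ratings differ by no more than one star), or (0, 0.0) if
    they share no books.
    """
    shared = set(ratings_a) & set(ratings_b)
    if not shared:
        return 0, 0.0
    close = sum(1 for book in shared if abs(ratings_a[book] - ratings_b[book]) <= 1)
    return len(shared), close / len(shared)

def should_suggest(ratings_a, ratings_b):
    """True if two readers overlap enough, and agree enough, to be introduced."""
    shared, score = similarity(ratings_a, ratings_b)
    return shared >= MIN_SHARED_BOOKS and score >= MIN_SIMILARITY

if __name__ == "__main__":
    # Tiny made-up example: two readers who have both rated the same three books.
    reader_1 = {"book_1": 5, "book_2": 3, "book_3": 4}
    reader_2 = {"book_1": 4, "book_2": 3, "book_3": 1}
    print(similarity(reader_1, reader_2))      # (3, 0.666...)
    print(should_suggest(reader_1, reader_2))  # False: too few shared books

The thresholds are the interesting design choice: requiring a big shared shelf keeps the suggestion meaningful, since two people who agree on three books tell you almost nothing.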

What I'm advocating for, I guess, is a stricter system of filtering participation, one that gives a more meaningful measurement.
