THE STONE AGE IS BACK - HCI Effects on Recommender Systems

Yuval Dan-Gur

2011

Abstract

We addressed HCI and social aspects of recommender systems by studying the largely uncharted domain of the advising group and the user's control over it. We conducted a longitudinal field study in which, for two years, our research tool, QSIA (which means 'question' in Hebrew), was freely available on the web and was adopted by various institutions and classes across heterogeneous learning domains. QSIA lets the user take part in forming the advising group: the user was free to choose an advising group for each recommendation sought, with the common 'neighbors group' as the default. QSIA yielded high internal validity of acceptance and rejection ratios thanks to the immediate "usage actions" that followed the recommendation outputs. Although the overall amount of data in QSIA's logs is fairly large (31,000 records, 10,000 items, 3,000 users), the figures relevant to the analysis of recommendations are modest: 895 recommendation-seeking records from 108 users, 3,000 rankings by 300 users, and 1,043 "usage actions" by 51 users. Our findings suggest that the perceived quality of the recommendations (measured in terms of "usage actions") is 14% to 24% higher (α≤0.05) for the user-controlled 'friends group' than for the machine-computed 'neighbors group'. It was almost as if the ancient tribal friends had been "revived" in modern information systems.
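To make the contrast concrete, the following is a minimal, hypothetical Python sketch (not QSIA's actual code) of the two advising-group modes the abstract refers to: a machine-computed 'neighbors group' selected by rating similarity, versus a user-controlled 'friends group' that the user names explicitly. Both feed the same aggregation step that produces the recommendation; all names and toy data are illustrative assumptions.

# Hypothetical sketch of 'neighbors group' vs. 'friends group' advising modes.
from collections import defaultdict
from math import sqrt

ratings = {  # user -> {item: rating}, toy data for illustration only
    "alice": {"q1": 5, "q2": 3, "q3": 4},
    "bob":   {"q1": 4, "q2": 2, "q4": 5},
    "carol": {"q2": 5, "q3": 1, "q4": 4},
    "dave":  {"q1": 5, "q3": 5, "q4": 2},
}

def cosine(u, v):
    """Cosine similarity over the items two users both rated."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    num = sum(u[i] * v[i] for i in common)
    den = sqrt(sum(u[i] ** 2 for i in common)) * sqrt(sum(v[i] ** 2 for i in common))
    return num / den if den else 0.0

def neighbors_group(user, k=2):
    """Default advising group: the k most similar raters (machine-computed)."""
    others = [(cosine(ratings[user], ratings[o]), o) for o in ratings if o != user]
    return [o for _, o in sorted(others, reverse=True)[:k]]

def recommend(user, advisers):
    """Average the advisers' ratings on items the user has not rated yet."""
    scores, counts = defaultdict(float), defaultdict(int)
    for adviser in advisers:
        for item, r in ratings[adviser].items():
            if item not in ratings[user]:
                scores[item] += r
                counts[item] += 1
    return sorted(((scores[i] / counts[i], i) for i in scores), reverse=True)

# Machine-computed default vs. a user-chosen friends group:
print(recommend("alice", neighbors_group("alice")))   # 'neighbors group'
print(recommend("alice", ["bob", "dave"]))            # 'friends group'

The only difference between the two calls is who supplies the advising group; the paper's finding concerns how users responded, through "usage actions", to the results of each mode.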



Paper Citation


in Harvard Style

Dan-Gur, Y. (2011). THE STONE AGE IS BACK - HCI Effects on Recommender Systems. In Proceedings of the 7th International Conference on Web Information Systems and Technologies - Volume 1: WEBIST, ISBN 978-989-8425-51-5, pages 263-270. DOI: 10.5220/0003303902630270


in Bibtex Style

@conference{webist11,
author={Yuval Dan-Gur},
title={THE STONE AGE IS BACK - HCI Effects on Recommender Systems},
booktitle={Proceedings of the 7th International Conference on Web Information Systems and Technologies - Volume 1: WEBIST},
year={2011},
pages={263-270},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0003303902630270},
isbn={978-989-8425-51-5},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 7th International Conference on Web Information Systems and Technologies - Volume 1: WEBIST
TI - THE STONE AGE IS BACK - HCI Effects on Recommender Systems
SN - 978-989-8425-51-5
AU - Dan-Gur Y.
PY - 2011
SP - 263
EP - 270
DO - 10.5220/0003303902630270