Korean Academic Society of Business Administration
[ Article ]
Korean Management Review - Vol. 54, No. 6, pp.1877-1904
ISSN: 1226-1874 (Print)
Print publication date 31 Dec 2025
Received 26 Jun 2025 Revised 07 Sep 2025 Accepted 16 Oct 2025
DOI: https://doi.org/10.17287/kmr.2025.54.6.1877

AI 동료 수용성 요인에 관한 연구: 직무 특성의 조절효과와 산업 특성에 따른 차이를 중심으로

양재용 ; 박광태
(주저자) 한양대학교 산업융합학부
(교신저자) 고려대학교 경영대학
A Study on the Acceptance Factors of AI Colleagues: Focusing on the Moderating Effect of Job Characteristics and Differences according to Industrial Characteristics
Jae-Yong Yang ; Kwangtae Park
(First Author) School of Interdisciplinary Industrial Studies, Hanyang University jyyang@hanyang.ac.kr
(Corresponding Author) Korea University Business School ktpark@korea.ac.kr


Copyright 2025 THE KOREAN ACADEMIC SOCIETY OF BUSINESS ADMINISTRATION
This is an open access article distributed under the terms of the Creative Commons Attribution License 4.0, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

초록

본 연구는 AI 로봇에 대해서 인간 작업자가 느낄 수 있는 수용성 요인을 파악하고 직무 특성의 조절효과와 산업 특성에 따른 수용성 수준의 차이를 분석하는 것을 목적으로 한다. 이와 관련하여 기술수용모델, 알고리즘 회피 이론, 인간-로봇 상호작용 이론, 사회적 존재 이론, 직무-기술 적합성 이론 등 학제 간 이론적 논의를 통합적으로 고찰하여 측정모형과 측정지표를 개발하였다. AI 동료에 대한 수용성 요인으로는 신뢰도, 공정성 인식, 외형적 유사성, 감정적 거리감을 측정지표로 도출하였다. 국내 산업체 종사자들을 대상으로 설문조사를 실시한 결과, 공정성 인식과 감정적 거리감이 수용성에 유의한 영향을 미치는 것으로 나타났으며, 직무 특성이 신뢰도와 공정성 인식에 조절효과가 있는 것으로 나타났다. 또한 산업 특성의 측면에서 금융업은 제조업과 공공기관보다 높은 수용성을 보였고, 서비스업은 제조업보다 높은 수용성을 보였다. 본 연구는 AI 로봇의 수용성 결정요인과 직무 특성의 조절효과 및 산업 특성에 따른 차이를 규명함으로써 조직 설계와 AI 도입 전략에 대한 함의를 도출할 수 있을 것으로 기대한다.

Abstract

This study aims to identify the factors that influence human workers' acceptance of AI robots and to analyze the moderating effect of job characteristics and differences across industrial sectors. To this end, a measurement model and corresponding indicators were developed by comprehensively reviewing interdisciplinary theoretical backgrounds, including the Technology Acceptance Model, Algorithm Aversion Theory, Human–Robot Interaction Theory, Media Equation Theory, and Task–Technology Fit Theory. The acceptance factors for AI colleagues were operationalized using measurement indicators reflecting trustworthiness, perception of fairness, external similarity, and emotional distance. Based on a survey conducted among workers in Korean industries, the perception of fairness and emotional distance were found to have a significant effect on acceptance. Job characteristics were found to exert a moderating effect on the relationship between trustworthiness and acceptance, and between the perception of fairness and acceptance. Furthermore, the financial industry exhibited higher acceptance than both the manufacturing and public sectors, and the service industry showed higher acceptance than the manufacturing sector. This study is expected to provide practical implications for organizational design and AI adoption strategies by identifying the determinants of AI robot acceptance, validating the moderating role of job characteristics, and quantifying industry-specific differences in acceptance levels.

Keywords:

AI Colleagues Acceptance, Technology Acceptance Model, Algorithm Aversion, Human-Robot Interaction, Media Equation Theory, Task-Technology Fit

키워드:

AI 동료 수용성, 기술수용모델, 알고리즘 회피 이론, 인간-로봇 상호작용 이론, 사회적 존재 이론, 직무-기술 적합성 이론

References

  • 강인성, 나건 (2022). “자율주행 범죄 예방 안심귀가 로봇과 인간의 상호작용을 위한 감정 표현 HRI 디자인제안,” 한국디자인리서치, 제7권 2호, pp.246-255.
    Kang, I. S., and Nah, K. (2022). “Proposal of Emotional HRI Design for human interaction with autonomous driving robot that prevents crime,” Design Research, 7(2), pp.246-255. [ https://doi.org/10.46248/kidrs.2022.2.246 ]
  • 구성환, 신민수 (2013). “모바일오피스의 과업․기술 적합모델과 조직특성이 직무성과에 미치는 영향에 관한 연구,” 한국산학기술학회 논문지, 제14권 2호, pp.644-654.
    Koo, S. N., and Shin, M. S. (2013). “The Study on the Impact of the Task-Technology Fit Model and Organizational Characteristics of the Mobile Office System on the Job Performance,” Journal of the Korea Academia-Industrial Cooperation Society, 14(2), pp.644-654. [ https://doi.org/10.5762/KAIS.2013.14.2.644 ]
  • 김경아, 유원규 (2025). “중소기업 DX에 대한 혁신지원정책의 영향 및 산업특성 조절효과,” 한국비교정부학보, 제29권 2호, pp.277-300.
    Kim, K. A., and You, W. (2025). “The Influence of Innovation Support Policies on SMEs’ Digital Transformation: The Moderating Role of Industry Characteristics,” Korean Comparative Government Review, 29(2), pp.277-300.
  • 김다혜, 최미주, 최영준 (2024). “서비스 로봇 사용에 대한 호텔 관리자 및 잠재 고객의 인식 연구,” 관광진흥연구, 특집호, pp.73-95.
    Kim, D. H., Choi, M. J., and Choi, Y. J. (2024). “A Study on the Perceptions of Hotel Managers and Potential Guests Regarding the Use of Service Robots,” Journal of Tourism Enhancement, Special Issue, pp.73-95. [ https://doi.org/10.35498/kotes.2024.se9.73 ]
  • 김민성 (2023). “호황 불구 인력난 조선소, 용접 협동로봇 도입 확산… 안전사고 예방도,” 뉴스1, https://www.news1.kr/industry/energy-heavyindustry/4923986, 2025년 6월 접속.
    Kim, M. S. (2023). “Despite the Boom, Shipyards Face a Labor Shortage and the Adoption of Collaborative Welding Robots Is Spreading, Also Helping Prevent Safety Accidents,” News1, https://www.news1.kr/industry/energy-heavyindustry/4923986, retrieved June 2025.
  • 김은경 (2025). “AI 대체가능성 큰 직업은… “창의력 필요한 직군도 대체율 높아”,” 연합뉴스, https://www.yna.co.kr/view/AKR20250411111700530, 2025년 6월 접속.
    Kim, E. K. (2025). “Jobs with High Potential for AI Replacement… “Creativity-Required Jobs Also Have High Replacement Rates”,” Yonhap News Agency, https://www.yna.co.kr/view/AKR20250411111700530, retrieved June 2025.
  • 김은영 (2025). “한국 제조업 AI 도입률 25.4%... 타 업종 평균보다 낮아 ‘디지털 격차’ 심화,” AI matters, https://aimatters.co.kr/news-report/ai-report/24669/, 2025년 8월 접속.
    Kim, E. Y. (2025). “Korea’s Manufacturing AI Adoption Rate Is 25.4%, Lower than the Average of Other Industries, Widening the ‘Digital Divide’,” AI matters, https://aimatters.co.kr/news-report/ai-report/24669/, retrieved August 2025.
  • 김태웅 (2012). “프로젝트 공급망 내에서의 정보공유, 인센티브 및 협력의지에 관한 탐색적 연구,” 한국생산관리학회지, 제23권 1호, pp.71-87.
    Kim, T. W. (2012). “An Exploratory Study on Information Sharing, Incentives and Collaboration in Project-based Supply Chain,” Journal of the Korean Production and Operations Management Society, 23(1), pp.71-87.
  • 남상희, 문혜진 (2025). “인공지능 직무대체 인식이 직무소진에 미치는 영향 : 도전평가와 위협평가의 매개효과,” 인적자원개발연구, 제28권 2호, pp.59-83.
    Nam, S. H., and Moon, H. (2025). “Artificial Intelligence Awareness and Job Burnout: The Dual Mediating Roles of Challenge and Threat Appraisal,” Korean Journal of Human Resources Development, 28(2), pp.59-83. [ https://doi.org/10.24991/KJHRD.2025.06.28.59.83 ]
  • 유관령, 이태희 (2024). “산업별 ESG 이행 및 성과 차이에 관한 연구: 유통산업과 금융산업을 중심으로,” 경영컨설팅연구, 제24권 3호, pp.209-220.
    You, G. L., and Lee, T. (2024). “A Study on ESG Implementation and Performance Differences by Industry: Focusing on Distribution and Financial Industry,” Korean Management Consulting Review, 24(3), pp.209-220.
  • 윤명출, 송영렬 (2014). “중소기업의 산업특성이 기술혁신과 경영성과에 미치는 영향,” 상업경영연구, 제28권 5호, pp.225-247.
    Yoon, M. C., and Song, Y. R. (2014). “Effect of SMEs’ Industry Characteristics on the Technological Innovation and Firm’s Performance,” Korean Journal of Business & Management, 28(5), pp.225-247.
  • 이민영, 박세영, 김보민, 장원석 (2022). “로봇심판 적용에 대한 야구팬의 인식: 인간-로봇 심판 상호작용 관점에서,” 체육과학연구, 제33권 3호, pp.440-450.
    Lee, M., Park, S. Y., Kim, B., and Jang, W. (2022). “Baseball Fans’ Evaluations of Robot Umpire: The Perspective of Human-Robot Interaction,” Korean Journal of Sport Science, 33(3), pp.440-450. [ https://doi.org/10.24985/kjss.2022.33.3.440 ]
  • 이원준 (2024). “ChatGPT 같은 멘토, 우리 같은 멘토: AI 멘토쉽의 특성과 영향,” 경영학연구, 제53권 6호, pp.1353-1374.
    Lee, W. J. (2024). “Mentor Like ChatGPT, Mentee Like Us: AI Mentorship’s Characteristics and Influence,” Korean Management Review, 53(6), pp.1353-1374. [ https://doi.org/10.17287/kmr.2024.53.6.1353 ]
  • 이한신, 김판수 (2019). “소비자의 기술수용과 저항이 인공지능(AI) 사용의도에 미치는 영향,” 경영학연구, 제48권 5호, pp.1195-1219.
    Lee, H. S., and Kim, P. (2019). “The Effect of Consumer’s Technology Acceptance and Resistance on Intention to Use of Artificial Intelligence(AI),” Korean Management Review, 48(5), pp.1195-1219. [ https://doi.org/10.17287/kmr.2019.48.5.1195 ]
  • 장종원 (2024). “산업별 AI 활용 사례,” 삼성SDS 인사이트 리포트, https://www.samsungsds.com/kr/insights/ai_use_cases.html, 2025년 8월 접속.
    Jang, J. W. (2024). “AI Use Cases by Industry,” Samsung SDS Insight Report, https://www.samsungsds.com/kr/insights/ai_use_cases.html, retrieved August 2025.
  • 전성일, 이기세, 양해면 (2010). “산업 특성에 따른 연구개발비 지출과 특허취득이 기업가치에 차별적으로 반응하는가?” 지식경영연구, 제11권 3호, pp.1-11.
    Jeon, S. I., Lee, K., and Yang, H. M. (2010). “Does the Differential Effects of R&D Expenditure and Patents on Firm-value Exist between High-tech and Low-tech Industries?,” Knowledge Management Research, 11(3), pp.1-11. [ https://doi.org/10.15813/kmr.2010.11.3.001 ]
  • 정경화 (2025). “[기획]산업 전 영역으로 확산되는 로봇,” 매일일보, https://www.m-i.kr/news/articleView.html?idxno=1232474, 2025년 6월 접속.
    Jeong, K. W. (2025). “[Special Report] Robots Spreading Across All Industries,” Maeil Ilbo, https://www.m-i.kr/news/articleView.html?idxno=1232474, retrieved June 2025.
  • 정예원, 양해은, 권영옥, 장영봉 (2025). “기업의 디지털 활동 역량과 재무 성과 분석: IT와 유통 산업군의 비교,” 경영정보학연구, 제27권 1호, pp.379-393.
    Cheung, Y., Yang, H. E., Kwon, Y. O., and Chang, Y. B. (2025). “The Impact of Digital Activity on Financial Performance: A Comparative Analysis of the IT and Retail Industries,” Information Systems Review, 27(1), pp.379-393. [ https://doi.org/10.14329/isr.2025.27.1.379 ]
  • 조영우, 김수형, 김경윤, 양형정 (2023). “산업현장에서 인간-협동로봇 협업 및 상호작용을 위한 감성 컴퓨팅 기술 분석,” 멀티미디어학회논문지, 제26권 2호, pp.380-398.
    Jo, Y. W., Kim, S. H., Kim, K. Y., and Yang, H. J. (2023). “Analysis of Emotional Computing Technology for Human-Cobot Collaboration and Interaction in Industrial Sites,” Journal of Korea Multimedia Society, 26(2), pp.380-398. [ https://doi.org/10.9717/kmms.2023.26.2.380 ]
  • 주문정 (2025). “제조현장 AI 도입률 3.9% 그쳐… 정보통신 분야 25.7%에 비해 낮아,” ZDNET Korea, https://zdnet.co.kr/view/?no=20250417154416, 2025년 8월 접속.
    Joo, M. J. (2025). “The AI Adoption Rate in Manufacturing Remains at 3.9%, Lower than the 25.7% in the Information and Communications Sector,” ZDNet Korea, https://zdnet.co.kr/view/?no=20250417154416, retrieved August 2025.
  • 최상묵, 최도영 (2025). “서비스 로봇의 고객지향성과 고객준비도가 서비스 로봇에 대한 신뢰형성 및 신뢰전이에 미치는 영향,” 경영학연구, 제54권 1호, pp. 191-218.
    Choi, S. M., and Choi, D. Y. (2025). “The Effects of a Service Robot’s Customer Orientation and Customer Readiness on Trust Formation and Trust Transfer in Service Robots,” Korean Management Review, 54(1), pp.191-218. [ https://doi.org/10.17287/kmr.2025.54.1.191 ]
  • Acemoglu, D., and Restrepo, P. (2020). “Robots and Jobs: Evidence from US Labor Markets,” Journal of Political Economy, 128(6), pp.2188-2244. [https://doi.org/10.1086/705716]
  • Alyoussef, I. Y. (2023). “Acceptance of e-learning in higher education: The role of task-technology fit with the information systems success model,” Heliyon, 9, p.e13751. [https://doi.org/10.1016/j.heliyon.2023.e13751]
  • Bagozzi, R. P., and Yi, Y. (1988). “On the evaluation of structural equation models,” Journal of the Academy of Marketing Science, 16(1), pp.74-94. [https://doi.org/10.1177/009207038801600107]
  • Bekey, G. A. (2005). Autonomous Robots: From Biological Inspiration to Implementation and Control, MIT Press.
  • Berger, B., Adam, M., Rühr, A., and Benlian, A. (2021). “Watch me improve—algorithm aversion and demonstrating the ability to learn,” Business & Information Systems Engineering, 63(1), pp.55-68. [https://doi.org/10.1007/s12599-020-00678-5]
  • Besigomwe, K. (2025). “Human-in-the-Loop Self-Healing Systems: Integrating Human Oversight for Autonomous Failure Detection, Repair and System Optimization,” Cognizance Journal of Multidisciplinary Studies, 5(3), pp.254-267. [https://doi.org/10.47760/cognizance.2025.v05i03.020]
  • Binns, R., Van Kleek, M., Veale, M., Lyngs, U., Zhao, J., and Shadbolt, N. (2018). “‘It’s Reducing a Human Being to a Percentage’; Perceptions of Justice in Algorithmic Decisions,” Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, pp. 1-14. [https://doi.org/10.1145/3173574.3173951]
  • Biocca, F., and Harms, C. (2002). “Defining and measuring social presence: Contribution to the networked minds theory and measure,” In: Gouveia, F. R., and Biocca, F. (Eds.), The 5th International Workshop on Presence, Porto: University Fernando Pessoa, pp.7-36.
  • Broadbent, E., Stafford, R., and MacDonald, B. (2009). “Acceptance of healthcare robots for the older population: Review and future directions,” International Journal of Social Robotics, 1(4), pp.319-330. [https://doi.org/10.1007/s12369-009-0030-6]
  • Brown, S. (2020). “A New Study Measures the Actual Impact of Robots on Jobs. It’s Significant,” MIT Sloan School of Management, https://mitsloan.mit.edu/ideas-made-to-matter/a-new-study-measures-actual-impact-robots-jobs-its-significant, retrieved June 2025.
  • Brynjolfsson, E., and McAfee, A. (2014). The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies, W. W. Norton & Company.
  • Bughin, J., Seong, J., Manyika, J., Chui, M., and Joshi, R. (2018). Notes from the AI Frontier: Modeling the Impact of AI on the World Economy, McKinsey Global Institute.
  • Chakraborty, D., Troise, C., and Bresciani, S. (2025). “Exploring consumer intentions to continue: Integrating task technology fit and social technology fit in generative AI based shopping platforms,” Technovation, 142, p.103189. [https://doi.org/10.1016/j.technovation.2025.103189]
  • Chang, C. M., and Hsu, M. H. (2016). “Understanding the determinants of users’ subjective wellbeing in social networking sites: an integration of social capital theory and social presence theory,” Behaviour & Information Technology, 35(9), pp.720-729. [https://doi.org/10.1080/0144929X.2016.1141321]
  • Chugunova, M., and Sele, D. (2022). “We and it: An interdisciplinary review of the experimental evidence on how humans interact with machines,” Journal of Behavioral and Experimental Economics, 99, p.101897. [https://doi.org/10.1016/j.socec.2022.101897]
  • Colquitt, J. A., and Rodell, J. B. (2015). “Measuring justice and fairness,” In Oxford Handbook of Justice in the Workplace, Cropanzano, R., and Ambrose, M. L. (Eds.). Oxford University Press, pp.187-202.
  • Cortina, J. M. (1993). “What is coefficient alpha? An examination of theory and applications,” Journal of Applied Psychology, 78(1), pp. 98-104. [https://doi.org/10.1037//0021-9010.78.1.98]
  • Davis, F. D. (1989). “Perceived Usefulness, Perceived Ease of Use, and User Acceptance of Information Technology,” MIS Quarterly, 13(3), pp.319-340. [https://doi.org/10.2307/249008]
  • Dietvorst, B. J., Simmons, J. P., and Massey, C. (2015). “Algorithm Aversion: People Erroneously Avoid Algorithms After Seeing Them Err,” Journal of Experimental Psychology: General, 144 (1), pp.114-126. [https://doi.org/10.1037/xge0000033]
  • Dishaw, M. T., and Strong, D. M. (1998). “Extending the Technology Acceptance Model with Task-Technology Fit Constructs,” Information & Management, 36(1), pp.9-21. [https://doi.org/10.1016/S0378-7206(98)00101-3]
  • Downen, T., Kim, S., and Lee, L. (2024). “Algorithm aversion, emotions, and investor reaction: Does disclosing the use of AI influence investment decisions?,” International Journal of Accounting Information Systems, 52, p.100664. [https://doi.org/10.1016/j.accinf.2023.100664]
  • Eastwood, B. (2024). “The who, what, and where of AI adoption in America,” MIT Sloan School of Management, https://mitsloan.mit.edu/ideas-made-to-matter/who-what-and-where-ai-adoption-america
  • Eisinga, R., Grotenhuis, M. T., and Pelzer, B. (2013). “The reliability of a two-item scale: Pearson, Cronbach, or Spearman-Brown?,” International Journal of Public Health, 58(4), pp.637-642. [https://doi.org/10.1007/s00038-012-0416-3]
  • Feng, S., Yamato, N., Ishiguro, H., Shiomi, M., and Sumioka, H. (2025). “Baby schema in human-robot physical interaction: Influence of baby likeness in a communication robot on caregiving behavior,” Computers in Human Behavior: Artificial Humans, 4, p.100150. [https://doi.org/10.1016/j.chbah.2025.100150]
  • Fornell, C., and Larcker, D. F. (1981). “Evaluating Structural Equation Models with Unobservable Variables and Measurement Error,” Journal of Marketing Research, 18(1), pp.39-50. [https://doi.org/10.1177/002224378101800104]
  • Frick, W. (2015). When Your Boss Wears Metal Pants, Harvard Business Review, June, pp. 84-89.
  • Gazit, L., Arazy, O., and Hertz, U. (2023). “Choosing between human and algorithmic advisors: The role of responsibility sharing,” Computers in Human Behavior: Artificial Humans, 1(2), p.100009. [https://doi.org/10.1016/j.chbah.2023.100009]
  • Germann, M., and Merkle, C. (2022). “Algorithm Aversion in Delegated Investing,” Journal of Business Economics, 93, pp.1691-1727. [https://doi.org/10.1007/s11573-022-01121-9]
  • Gombolay, M. C., Gutierrez, R. A., Clarke, S. G., Sturla, G. F., and Shah, J. A. (2015). “Decision-making authority, team efficiency and human worker satisfaction in mixed human-robot teams,” Autonomous Robots, 39, pp.293-312. [https://doi.org/10.1007/s10514-015-9457-9]
  • Goodhue, D. L., and Thompson, R. L. (1995). “Task-Technology Fit and Individual Performance,” MIS Quarterly, 19(2), pp.213-236. [https://doi.org/10.2307/249689]
  • Graetz, G., and Michaels, G. (2018). “Robots at Work,” The Review of Economics and Statistics, 100(5), pp.753-768. [https://doi.org/10.1162/rest_a_00754]
  • Gunawardena, C. N. (1995). “Social presence theory and implications for interaction and collaborative learning in computer conferences,” International Journal of Educational Telecommunications, 1(2/3), pp.147-166.
  • Hancock, P. A., Billings, D. R., Schaefer, K. E., et al. (2011). “A meta-analysis of factors affecting trust in human-robot interaction,” Human Factors, 53(5), pp.517-527. [https://doi.org/10.1177/0018720811417254]
  • Heßler, P. O., Pfeiffer, J., and Hafenbrädl, S. (2022). “When self-humanization leads to algorithm aversion: what users want from decision support systems on prosocial microlending platforms,” Business & Information Systems Engineering, 64(3), pp.275-292. [https://doi.org/10.1007/s12599-022-00754-y]
  • Howard, M. C., and Hair Jr., J. F. (2023). “Integrating the expanded task–technology fit theory and the technology acceptance model: A multi-wave empirical analysis,” AIS Transactions on Human–Computer Interaction, 15(1), pp.83-110. [https://doi.org/10.17705/1thci.00184]
  • Jauernig, J., Uhl, M., and Walkowitz, G. (2022). “People prefer moral discretion to algorithms: algorithm aversion beyond intransparency,” Philosophy & Technology, 35(1), pp.1-25. [https://doi.org/10.1007/s13347-021-00495-y]
  • Jarrahi, M. H. (2018). “Artificial Intelligence and the Future of Work: Human-AI Symbiosis in Organizational Decision Making,” Business Horizons, 61, pp.577-586. [https://doi.org/10.1016/j.bushor.2018.03.007]
  • Jiahe, P., Sarah, S., Yan, Z., Ramtin, T., Muhammad, B., and Wafa, J. (2025). “OfficeMate: Pilot Evaluation of an Office Assistant Robot,” 2025 20th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp.1529-1533. [https://doi.org/10.1109/HRI61500.2025.10974132]
  • Jie, Y. E. (2025). “4 in 10 Korean Firms Turn to AI to Boost Efficiency: Survey,” The Korea Herald, https://www.koreaherald.com/article/10504242
  • Julious, S. A. (2005). “Sample size of 12 per group rule of thumb for a pilot study,” Pharmaceutical Statistics, 4, pp.287-291. [https://doi.org/10.1002/pst.185]
  • Jung, M., and Seiter, M. (2021). “Towards a better understanding on mitigating algorithm aversion in forecasting: an experimental study,” Journal of Management Control, 32 (4), pp.495-516. [https://doi.org/10.1007/s00187-021-00326-3]
  • Kätsyri, J., Förger, K., Mäkäräinen, M., and Takala, T. (2015). “A review of empirical evidence on different uncanny valley hypotheses: Support for perceptual mismatch as one road to the valley of eeriness,” Frontiers in Psychology, 6, p.390. [https://doi.org/10.3389/fpsyg.2015.00390]
  • Kodur, K., Zand, M., Tognotti, M., Banerjee, S., Banerjee, N. K., and Kyrarini, M. (2025). “Exploring the Dynamics of Human-Robot Interaction: Robot Error, Sentiment Analysis, and Politeness,” International Journal of Social Robotics, pp.1-14. [https://doi.org/10.1007/s12369-025-01282-x]
  • Koh, L. Y., and Yuen, K. F. (2025). “Individual-, task-, and technology-fit perspective of autonomous delivery robots confirmation and adoption in smart cities,” International Journal of Hospitality Management, 128, p.104182. [https://doi.org/10.1016/j.ijhm.2025.104182]
  • Lee, M. K. (2018). “Understanding Perception of Algorithmic Decisions: Fairness, Trust, and Emotion in Response to Algorithmic Management,” Big Data & Society, January-June, pp.1-16. [https://doi.org/10.1177/2053951718756684]
  • Lin, X., Wang, T., and Sheng, F. (2025). “Exploring the dual effect of trust in GAI on employees’ exploitative and exploratory innovation,” Humanities & Social Sciences Communications, 12(1), p.663. [https://doi.org/10.1057/s41599-025-04956-z]
  • Liu, T. (2024). “Research on Legal Responsibility Attribution for Autonomous Systems: An AI Governance Perspective,” Science of Law Journal, 3(7), pp.166-174. [https://doi.org/10.23977/law.2024.030722]
  • Mavridis, N. (2015). “A Review of Verbal and Non-Verbal Human–Robot Interactive Communication,” Robotics and Autonomous Systems, 63, pp. 22-35. [https://doi.org/10.1016/j.robot.2014.09.031]
  • Mayer, H., Yee, L., Chui, M., and Roberts, R. (2025). Superagency in the Workplace: Empowering People to Unlock AI’s Full Potential, McKinsey & Company, https://www.mckinsey.com/capabilities/mckinsey-digital/our-insights/superagency-in-the-workplace-empowering-people-to-unlock-ais-full-potential-at-work
  • McGahan, A. M., and Porter, M. E. (1997). “How Much Does Industry Matter, Really?,” Strategic Management Journal, 18(Summer Special Issue), pp.15-30. [https://doi.org/10.1002/(SICI)1097-0266(199707)18:1+<15::AID-SMJ916>3.0.CO;2-1]
  • Nass, C., Steuer, J., and Tauber, E. R. (1994). “Computers are Social Actors,” Conference on Human Factors in Computing Systems, April 24-28, pp.72-78. [https://doi.org/10.1145/191666.191703]
  • Nunnally, J., and Bernstein, I. (1994). Psychometric Theory (3rd ed.), McGraw-Hill.
  • OECD (2019). Artificial Intelligence in Society, OECD Publishing, Paris. [https://doi.org/10.1787/eedfee77-en]
  • Osei-Frimpong, K., and McLean, G. (2018). “Examining online social brand engagement: A social presence theory perspective,” Technological Forecasting & Social Change, 123, pp.10-21. [https://doi.org/10.1016/j.techfore.2017.10.010]
  • Othman, U., and Yang, E. (2023). “Human-Robot Collaborations in Smart Manufacturing Environments: Review and Outlook,” Sensors, 23(12), p.5663. [https://doi.org/10.3390/s23125663]
  • Pan, X. (2020). “Technology Acceptance, Technological Self-Efficacy, and Attitude Toward Technology-Based Self-Directed Learning: Learning Motivation as a Mediator,” Frontiers in Psychology, 11, p.564294. [https://doi.org/10.3389/fpsyg.2020.564294]
  • Pinney, J., Carroll, F., and Newbury, P. (2022). “Human-robot interaction: the impact of robotic aesthetics on anticipated human trust,” PeerJ Computer Science, 8, p.e837. [https://doi.org/10.7717/peerj-cs.837]
  • Porter, M. E. (1980). Competitive Strategy: Techniques for Analyzing Industries and Competitors, NY: Free Press.
  • Przegalinska, A., Triantoro, T., Kovbasiuk, A., Ciechanowski, L., Freeman, R. B., and Sowa, K. (2024). “Collaborative AI in the workplace: Enhancing organizational performance through resource-based and task–technology fit perspectives,” International Journal of Information Management, 81, p.102853. [https://doi.org/10.1016/j.ijinfomgt.2024.102853]
  • Reeves, B., and Nass, C. (1996). The Media Equation: How People Treat Computers, Television, and New Media like Real People and Places, Cambridge University Press.
  • Reich, T., Kaju, A., and Maglio, S. J. (2023). “How to overcome algorithm aversion: Learning from mistakes,” Journal of Consumer Psychology, 33(2), pp.285-302. [https://doi.org/10.1002/jcpy.1313]
  • Shea, R. N. (2025). “Safer, Clearer, and More Explicit: ISO 10218 Gets a Makeover,” Universal Robots, available at https://www.universal-robots.com/blog/safer-clearer-and-more-explicit-iso-10218-gets-a-makeover/, retrieved June 2025.
  • Shipps, A. (2024). “AI Assistant Monitors Teamwork to Promote Effective Collaboration,” MIT News, https://news.mit.edu/2024/ai-assistant-monitors-teamwork-promote-effective-collaboration-0819, retrieved June 2025.
  • Short, J., Williams, E., and Christie, B. (1976). The Social Psychology of Telecommunications. John Wiley and Sons, London.
  • Tu, C. H., and McIsaac, M. (2002). “The relationship of social presence and interaction in online classes,” The American Journal of Distance Education, 16(3), pp.131-150. [https://doi.org/10.1207/S15389286AJDE1603_2]
  • U.S. Bureau of Labor Statistics (2025). Occupational Outlook Handbook, https://data.bls.gov/search/query/results?cx=013738036195919377644%3A6ih0hfrgl50&q=artificial+intelligence+inurl%3Abls.gov%2Fooh%2F, retrieved June 2025.
  • Venkatesh, V., and Bala, H. (2008). “Technology acceptance model 3 and a research agenda on interventions,” Decision Sciences, 39(2), pp.273-315. [https://doi.org/10.1111/j.1540-5915.2008.00192.x]
  • Venkatesh, V., and Davis, F. D. (2000). “A Theoretical Extension of the Technology Acceptance Model: Four Longitudinal Field Studies,” Management Science, 46(2), pp.186-204. [https://doi.org/10.1287/mnsc.46.2.186.11926]
  • Wang, Y., Sun, X., Zhang, X., and Shi, H. (2025). “Do you feel empowered by AI service robot? An exploration of consumer’s social power perception in human-AI interaction,” Journal of Retailing and Consumer Services, 87, p.104434. [https://doi.org/10.1016/j.jretconser.2025.104434]
  • Wamba-Taguimdje, S. L., Wamba, S. F., Kamdjoug, J. R. K., and Wanko, C. E. T. (2020). “Influence of Artificial Intelligence (AI) on Firm Performance: The Business Value of AI-based Transformation Projects,” Business Process Management Journal, 26(7), pp.1893-1924. [https://doi.org/10.1108/BPMJ-10-2019-0411]
  • Waytz, A., Heafner, J., and Epley, N. (2014). “The mind in the machine: Anthropomorphism increases trust in an autonomous vehicle,” Journal of Experimental Social Psychology, 52, pp.113-117. [https://doi.org/10.1016/j.jesp.2014.01.005]
  • Wurman, P. R., D’Andrea, R., and Mountz, M. (2008). “Coordinating Hundreds of Cooperative, Autonomous Vehicles in Warehouses,” AI Magazine, 29(1), pp.9-20.
  • Yam, K. C., Bigman, Y., and Gray, K. (2021). “Reducing the uncanny valley by dehumanizing humanoid robots,” Computers in Human Behavior, 125, p.106945. [https://doi.org/10.1016/j.chb.2021.106945]
  • Yamaguchi, M. (2025). “Item-level implicit affective measures reveal the uncanny valley of robot faces,” International Journal of Human-Computer Studies, 196, p.103443.
  • Yu, G., Tan, G., Huang, H., Zhang, Z., Chen, P., Natella, R., and Zheng, Z. (2025). “A Survey on Failure Analysis and Fault Injection in AI Systems,” ACM Transactions on Software Engineering and Methodology. [https://doi.org/10.1145/3732777]
  • Zhang, X., King, A., and Prior, H. (2025). “Attitudes before actions: how music teachers’ technological acceptance and competence shape technological behaviour in China,” Humanities and Social Sciences Communications, 12, p.1222.

∙ Author Jae-Yong Yang is currently an Assistant Professor of Management Engineering in the School of Interdisciplinary Industrial Studies at Hanyang University. He received his Ph.D. in Management Consulting from Hanyang University. Before his appointment as a professor, he worked at Kia Motors, Shinsegae I&C, and LG Innotek, where he was responsible for asset management, financial management, production and operations management, business planning, marketing, and IT systems planning. His main research areas include AI and digital transformation, ESG and sustainability, and product-service systems (PSS).

∙ Author Kwangtae Park is currently a Professor at Korea University Business School. He received his bachelor’s and master’s degrees from Seoul National University and his Ph.D. in Industrial Engineering/Management Science from the University of California, Berkeley. His main research areas include SCM, service management, and innovation.