1. U. Frith, C. Frith, The social brain: Allowing humans to boldly go where no other
species has been. Philos. Trans. R. Soc. Lond. B Biol. Sci. 365, 165–176 (2010).
2. M. Heldner, J. Edlund, Pauses, gaps and overlaps in conversations. J. Phonetics 38,
555–568 (2010).
3. S. C. Levinson, F. Torreira, Timing in turn-taking and its implications for processing
models of language. Front. Psychol. 6, 731 (2015).
4. A. Stolk, L. Verhagen, I. Toni, Conceptual alignment: How brains achieve mutual
understanding. Trends Cogn. Sci. 20, 180–191 (2016).
5. A. M. Mastroianni, D. T. Gilbert, G. Cooney, T. D. Wilson, Do conversations end when people want them to? Proc. Natl. Acad. Sci. U.S.A. 118, 1–9 (2021).
6. C. D. Mortensen, Communication Theory (Transaction Publishers, 2011).
7. B. J. Hedge, B. S. Everitt, C. D. Frith, The role of gaze in dialogue. Acta Psychol. (Amst.)
42, 453–475 (1978).
8. L. Hirvenkari et al., Influence of turn-taking in a two-person conversation on the gaze
of a viewer. PLoS One 8, e71569 (2013).
9. A. Kendon, Some functions of gaze-direction in social interaction. Acta Psychol. 26,
22–63 (1967).
10. N. Binetti, C. Harrison, A. Coutrot, A. Johnston, I. Mareschal, Pupil dilation as an index
of preferred mutual gaze duration. R. Soc. Open Sci. 3, 160086 (2016).
11. M. Jarick, R. Bencic, Eye contact is a two-way street: Arousal is elicited by the sending
and receiving of eye gaze information. Front. Psychol. 10, 1262 (2019).
12. A. Mazur et al., Physiological aspects of communication via mutual gaze. Am. J. Sociol. 86, 50–74 (1980).
13. D. H. Abney, S. H. Suanda, L. B. Smith, C. Yu, What are the building blocks of parent-infant coordinated attention in free-flowing interaction? Infancy 25, 871–887 (2020).
14. L. Conty, C. Tijus, L. Hugueville, E. Coelho, N. George, Searching for asymmetries in
the detection of gaze contact versus averted gaze under different head views: A
behavioural study. Spat. Vis. 19, 529–545 (2006).
15. A. Senju, T. Hasegawa, Direct gaze captures visuospatial attention. Vis. Cogn. 12,
127–144 (2005).
16. G. Aston-Jones, J. Rajkowski, P. Kubiak, T. Alexinsky, Locus coeruleus neurons in monkey are selectively activated by attended cues in a vigilance task. J. Neurosci. 14, 4467–4480 (1994).
17. J. Rajkowski, Correlations between locus coeruleus (LC) neural activity, pupil diameter and behavior in monkey support a role of LC in attention. Soc. Neurosci. Abstr. 19, 974 (1993).
18. D. Alnæs et al., Pupil size signals mental effort deployed during multiple object tracking and predicts brain activity in the dorsal attention network and the locus coeruleus. J. Vis. 14, 1 (2014).
19. M. S. Gilzenrat, S. Nieuwenhuis, M. Jepma, J. D. Cohen, Pupil diameter tracks changes
in control state predicted by the adaptive gain theory of locus coeruleus function.
Cogn. Affect. Behav. Neurosci. 10, 252–269 (2010).
20. S. Joshi, Y. Li, R. M. Kalwani, J. I. Gold, Relationships between pupil diameter and neuronal activity in the locus coeruleus, colliculi, and cingulate cortex. Neuron 89, 221–234 (2016).
21. B. Hoeks, W. J. M. Levelt, Pupillary dilation as a measure of attention: A quantitative system analysis. Behav. Res. Meth. Instrum. Comput. 25, 16–26 (1993).
22. O. E. Kang, K. E. Huffer, T. P. Wheatley, Pupil dilation dynamics track attention to
high-level information. PLoS One 9, e102463 (2014).
23. O. Kang, M. R. Banaji, Pupillometric decoding of high-level musical imagery. Conscious. Cogn. 77, 102862 (2020).
24. R. L. van den Brink, P. R. Murphy, S. Nieuwenhuis, Pupil diameter tracks lapses of
attention. PLoS One 11, e0165274 (2016).
25. J. Smallwood et al., Pupillometric evidence for the decoupling of attention from
perceptual input during offline thought. PLoS One 6, e18298 (2011).
26. S. M. Wierda, H. van Rijn, N. A. Taatgen, S. Martens, Pupil dilation deconvolution
reveals the dynamics of attention at high temporal resolution. Proc. Natl. Acad. Sci.
U.S.A. 109, 8456–8460 (2012).
27. G. Shteynberg, Shared attention. Perspect. Psychol. Sci. 10, 579–590 (2015).
28. E. Prochazkova et al., Pupil mimicry promotes trust through the theory-of-mind
network. Proc. Natl. Acad. Sci. U.S.A. 115, E7265–E7274 (2018).
29. J. A. Van Breen, C. K. W. De Dreu, M. E. Kret, Pupil to pupil: The effect of a partner’s
pupil size on (dis)honest behavior. J. Exp. Soc. Psychol. 74, 231–245 (2018).
30. O. Kang, T. Wheatley, Pupil dilation patterns reflect the contents of consciousness.
Conscious. Cogn. 35, 128–135 (2015).
31. O. Kang, T. Wheatley, Pupil dilation patterns spontaneously synchronize across individuals during shared attention. J. Exp. Psychol. Gen. 146, 569–576 (2017).
32. U. Hasson, C. D. Frith, Mirroring and beyond: Coupled dynamics as a generalized framework for modelling social interactions. Philos. Trans. R. Soc. Lond. B Biol. Sci. 371, 1693 (2016).
33. V. Leong et al., Speaker gaze increases information coupling between infant and
adult brains. Proc. Natl. Acad. Sci. U.S.A. 114, 13290–13295 (2017).
34. S. Dikker et al., Brain-to-brain synchrony tracks real-world dynamic group interactions
in the classroom. Curr. Biol. 27, 1375–1380 (2017).
35. S. Kinreich, A. Djalovski, L. Kraus, Y. Louzoun, R. Feldman, Brain-to-brain synchrony
during naturalistic social interactions. Sci. Rep. 7, 17060 (2017).
36. I. Konvalinka et al., Synchronized arousal between performers and related spectators
in a fire-walking ritual. Proc. Natl. Acad. Sci. U.S.A. 108, 8514–8519 (2011).
37. J. Hirsch, X. Zhang, J. A. Noah, Y. Ono, Frontal temporal and parietal systems synchronize within and across brains during live eye-to-eye contact. Neuroimage 157, 314–330 (2017).
38. M. S. Kelley, J. A. Noah, X. Zhang, B. Scassellati, J. Hirsch, Comparison of human social
brain activity during eye-contact with another human and a humanoid robot. Front.
Robot. AI 7, 599581 (2021).
39. T. Koike et al., Neural substrates of shared attention as social memory: A hyperscanning functional magnetic resonance imaging study. Neuroimage 125, 401–412 (2016).
40. T. Koike, M. Sumiya, E. Nakagawa, S. Okazaki, N. Sadato, What makes eye contact
special? Neural substrates of on-line mutual eye-gaze: A hyperscanning fMRI study.
eNeuro 6, ENEURO.0284-18.2019 (2019).
41. J. A. Noah et al., Real-time eye-to-eye contact is associated with cross-brain neural
coupling in angular gyrus. Front. Hum. Neurosci. 14, 19 (2020).
42. L. J. Silbert, C. J. Honey, E. Simony, D. Poeppel, U. Hasson, Coupled neural systems underlie the production and comprehension of naturalistic narrative speech. Proc. Natl. Acad. Sci. U.S.A. 111, E4687–E4696 (2014).
43. G. J. Stephens, L. J. Silbert, U. Hasson, Speaker-listener neural coupling underlies
successful communication. Proc. Natl. Acad. Sci. U.S.A. 107, 14425–14430 (2010).
44. J. Chen et al., Shared memories reveal shared structure in neural activity across individuals. Nat. Neurosci. 20, 115–125 (2017).
45. S. Bögels, L. Magyari, S. C. Levinson, Neural signatures of response planning occur
midway through an incoming question in conversation. Sci. Rep. 5, 12881 (2015).
46. E. Redcay, L. Schilbach, Using second-person neuroscience to elucidate the mechanisms of social interaction. Nat. Rev. Neurosci. 20, 495–505 (2019).
47. L. Schilbach et al., Toward a second-person neuroscience. Behav. Brain Sci. 36,
393–414 (2013).
48. D. Bates, M. Maechler, B. Bolker, S. Walker, Fitting linear mixed-effects models using lme4. J. Stat. Softw. 67, 1–48 (2015).
49. D. R. Rutter, G. M. Stephenson, K. Ayling, P. A. White, The timing of looks in dyadic conversation. Br. J. Soc. Clin. Psychol. 17, 17–21 (1978).
50. J. Launay, B. Tarr, R. I. M. Dunbar, Synchrony as an adaptive mechanism for large-scale
human social bonding. Ethology 122, 779–789 (2016).
51. R. Mogan, R. Fischer, J. A. Bulbulia, To be in synchrony or not? A meta-analysis of synchrony’s effects on behavior, perception, cognition and affect. J. Exp. Soc. Psychol. 72, 13–20 (2017).
52. T. Wheatley, O. Kang, C. Parkinson, C. E. Looser, From mind perception to mental connection: Synchrony as a mechanism for social understanding. Soc. Personal. Psychol. Compass 6, 589–606 (2012).
53. M. J. Hove, J. L. Risen, It’s all in the timing: Interpersonal synchrony increases affiliation. Soc. Cogn. 27, 949–960 (2009).
54. F. Ramseyer, W. Tschacher, Nonverbal synchrony in psychotherapy: Coordinated body
movement reflects relationship quality and outcome. J. Consult. Clin. Psychol. 79,
284–295 (2011).
55. S. S. Wiltermuth, C. Heath, Synchrony and cooperation. Psychol. Sci. 20, 1–5 (2009).
56. C. Parkinson, A. M. Kleinbaum, T. Wheatley, Similar neural responses predict
friendship. Nat. Commun. 9, 332 (2018).
57. R. Hari, L. Henriksson, S. Malinen, L. Parkkonen, Centrality of social interaction in
human brain function. Neuron 88, 181–193 (2015).
58. B. Beebe, M. Steele, How does microanalysis of mother–infant communication inform maternal sensitivity and infant attachment? Attach. Hum. Dev. 15, 583–602 (2013).
59. R. Feniger-Schaal et al., Would you like to play together? Adults’ attachment and the mirror game. Attach. Hum. Dev. 18, 33–45 (2016).
60. S. Wallot, P. Mitkidis, J. J. McGraw, A. Roepstorff, Beyond synchrony: Joint action in a complex production task reveals beneficial effects of decreased interpersonal synchrony. PLoS One 11, e0168306 (2016).
61. L. Galbusera, M. T. M. Finn, W. Tschacher, M. Kyselo, Interpersonal synchrony feels
good but impedes self-regulation of affect. Sci. Rep. 9, 14691 (2019).
62. J. Hale, J. A. Ward, F. Buccheri, D. Oliver, A. F. C. Hamilton, Are you on my wavelength? Interpersonal coordination in naturalistic conversations. J. Nonverbal Behav. 44, 63–83 (2020).
63. I. Ravreby, Y. Shilat, Y. Yeshurun, Reducing synchronization to increase interest improves interpersonal liking. bioRxiv [preprint] (2021). https://www.biorxiv.org/content/10.1101/2021.06.30.450608v1.full.pdf. Accessed 7 July 2021.
64. A. Dahan, L. Noy, Y. Hart, A. Mayo, U. Alon, Exit from synchrony in joint improvised
motion. PLoS One 11, e0160747 (2016).
65. O. Mayo, I. Gordon, In and out of synchrony—Behavioral and physiological dynamics
of dyadic interpersonal coordination. Psychophysiology 57, e13574 (2020).
66. M. M. Egbert, Schisming: The transformation from a single conversation to multiple conversations. Res. Lang. Soc. Interact. 30, 1–51 (1997).
67. P. Hömke, J. Holler, S. C. Levinson, Eye blinking as addressee feedback in face-to-face conversation. Res. Lang. Soc. Interact. 50, 54–70 (2017).
68. S. E. Clayman, “Turn-constructional units and the transition-relevance place” in The Handbook of Conversation Analysis, J. Sidnell, T. Stivers, Eds. (Blackwell Publishing, 2013), pp. 151–166.
69. H. Sacks, E. A. Schegloff, G. Jefferson, “A simplest systematics for the organization of turn taking for conversation” in Studies in the Organization of Conversational Interaction, J. Schenkein, Ed. (Elsevier, 1978), pp. 7–55.
70. F. Maqsood, Effects of varying light conditions and refractive error on pupil size.
Cogent Med. 4, 1338824 (2017).
71. J. H. Cheong, S. Brooks, L. J. Chang, FaceSync: Open source framework for recording
facial expressions with head-mounted cameras. F1000 Res. 8, 702 (2019).
72. J. W. Peirce, PsychoPy—Psychophysics software in Python. J. Neurosci. Methods 162, 8–13 (2007).
73. H. Brugman, A. Russel, Annotating Multi-Media/Multi-Modal Resources with ELAN (LREC, 2004).
74. S. L. Rogers, O. Guidetti, C. P. Speelman, M. Longmuir, R. Phillips, Contact is in the eye
of the beholder: The eye contact illusion. Perception 48, 248–252 (2019).
75. D. J. Berndt, J. Clifford, Using dynamic time warping to find patterns in time series. KDD Workshop 48, 359–370 (1994).
76. S. Epskamp, M. K. Deserno, L. F. Bringmann, mlVAR: Multi-level vector autoregression. R Package Version 0.4 (2017). https://CRAN.R-project.org/package=mlVAR. Accessed 3 September 2021.
77. S. Wohltjen, T. Wheatley, eyeContact-in-conversation. GitHub. https://github.com/
sophiewohltjen/eyeContact-in-conversation. Deposited 6 April 2021.
S. Wohltjen, T. Wheatley, Eye contact marks the rise and fall of shared attention in conversation. PNAS, https://doi.org/10.1073/pnas.2106645118