(1) Overview


Computer-assisted behavioral observation systems

With the advent of handheld computing devices, behavioral observation coding evolved from checklists, clipboards, and stopwatches to polished GUIs with real-time data storage. Used by ethologists in the field, educators in the classroom, and behavioral scientists across diverse environments including hospitals, homes, and playgrounds, these contemporary coding systems can cover more behaviors with greater specificity and higher reliability [1].

Existing coding systems fall into one of two types: open-source or proprietary. Within each type, the scope of a program ranges from a specific topic area addressing a narrow research question to a more general omnibus system built to cover a wide range of behavioral areas. Proprietary systems tend to be well designed, more often general in scope, and often expensive. For example, the Noldus Observer XT [2] is typically used to observe human interaction across disparate settings, ranging from parent-child interaction [3] to patient-doctor conferences [4].

Conversely, there are ample, typically open-source, one-off computer programs that address a narrow band of research objectives focused on specific populations. These are well suited for pilot studies, limited-budget projects, or niche research situations. Because of their limited focus, these programs tend to be simple, built with accessible code, and have minimal bloat in either the code or the user interface (UI). pyObs is in this latter genre: built to capture time-stamped behavior, affect, and physical movement in preschool children, it was a critical component of a multi-year NSF-funded project, Peers Across Landscapes (PALs), that sought to describe the genesis of friendships in 3–5 year olds (Li & Griffin, 2013; Torrens & Griffin, 2012).

Implementation and architecture

Although pyObs typifies software of the type described above (i.e., specialized needs for a specific project), from the beginning the design focused on encompassing the critical characteristics of a general behavioral observation coding system (see [5] for a recent review). From conception to implementation, each feature emphasized speed and accuracy with minimal error, consistent with the design architecture. Specifically, the following outline covers the design precepts used during piloting and guided minor changes over the life of the project:

  1. Graphical User Interface:
    • Uncluttered user interface (UI)
    • Logical menu arrangement and flow
  2. Data Entry:
    • Immediate feedback on entry mistakes
    • Summary error checking
    • Data cannot be stored until all errors are cleared
  3. Project-Specific Needs:
    • Temporal cues for the coder
      – Clock with visual and audio feedback
    • Data are time-stamped with geospatial reference
      – x, y location
      – Farthest distance
  4. Data Output:
    • Local storage on the hard drive
    • Standard format (e.g., CSV)
    • Human readable
    • Configurable for immediate statistical analysis and database storage
  5. Extensible:
    • Scheme and UI can be changed as project needs change
    • New codes can be added with minimal programming
    • Rapid reconfiguration when subject and geospatial features change
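The data-entry rule in item 2 (no storage until all errors are cleared) can be sketched as follows. This is a minimal illustration; the field names and checks are hypothetical, not pyObs' actual implementation.

```python
# Illustrative sketch of "data cannot be stored until all errors are cleared";
# the field names and checks are hypothetical, not pyObs' actual code.
REQUIRED_FIELDS = ('behavior', 'task', 'affect', 'location')

def find_entry_errors(record):
    """Return a list of human-readable problems; an empty list means OK."""
    errors = ['missing: ' + f for f in REQUIRED_FIELDS if not record.get(f)]
    if record.get('affect') not in (None, '', 'positive', 'negative', 'neutral'):
        errors.append('affect must be positive, negative, or neutral')
    return errors

def save_record(record, storage):
    """Store the record only if it passes every check."""
    errors = find_entry_errors(record)
    if errors:
        return errors          # immediate feedback; nothing is stored
    storage.append(record)
    return []
```

A record with any missing field is rejected in full, mirroring the design goal of surfacing mistakes at entry time rather than during later cleaning.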


pyObs captures naturally-occurring free-play interactions among preschool children [6]. Over the 3 years of the PALs study, from piloting to implementation, the emphasis was on coder accuracy, error-free data entry, and speed. A large team of undergraduates using tablet PCs collected the data; to reduce the likelihood of entry mistakes, we focused on making the flow of the Graphical User Interface (GUI) intuitive. Likewise, data entry had to be simple, with robust error-checking methods.

During the PALs study, interrater reliability was assessed from initial piloting until data collection ended, 2.5 years and 178,565 observations later; Kappa scores ranged from .70 to .99 across all coding structures. Details of the behavioral coding associated with the PALs study are reported elsewhere [7, 8]. The general scheme was as follows: while watching a randomly chosen child for 10 s, the observer coded the child's behavior in a naturalistic setting, recording both social and geographic information. Specifically, a coder recorded whether the child was alone, with a teacher, playing with other children, or passively engaged in group behavior through parallel play (children playing in proximity to each other, but not with each other).

After determining the appropriate code, observers next recorded the associated affect (positive, negative, or neutral) in the presence of social peers (those involved in direct interaction) or area peers (those in the physical vicinity but not interacting with the target child). The specificity of the data, along with its breadth, enabled us to analyze how activity type and physical location combined to influence micro-social processes among preschool children. See Figure 1 for a schematic of this procedure.

Figure 1 

pyObs Target Child Scan for PALs.

Data Flow

As shown in Figure 1, pyObs has 5 primary components relating to the target child: Behavior, Task, Peers, Affect, and Physical Location. Within pyObs’ architecture, assuming the selected child is present and available, relevant interconnected menu boxes require completion before submission for data storage. Figure 2 illustrates the GUI data entry flow. Elaboration of the study and the coding procedure is available elsewhere [7, 8].

Figure 2 

pyObs Flow Chart of Data Entry for PALs.


Using pyObs requires minimal computer programming skill. As shown in Figure 3, data are entered via drop-down menus or check boxes. pyObs populates the selection options from either external or internal lists, depending on researcher preference. Output goes to the local directory unless directed elsewhere via an external configuration file. Data are saved as plain ASCII text in CSV format; in turn, these outputs are easily converted to a pandas DataFrame, a SQL database, or a spreadsheet for subsequent manipulation and statistical analysis.
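The output format can be sketched with Python's standard csv module. The column names below are illustrative, not the actual PALs coding scheme:

```python
import csv, io

# Illustrative pyObs-style records; the column names are hypothetical.
rows = [
    {'time': '09:14:03', 'child': 'C07', 'behavior': 'parallel',
     'affect': 'neutral', 'x': '4', 'y': '11'},
    {'time': '09:14:13', 'child': 'C07', 'behavior': 'group',
     'affect': 'positive', 'x': '5', 'y': '11'},
]

# Write human-readable CSV, one observation per line, header first.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(rows[0].keys()))
writer.writeheader()
writer.writerows(rows)
csv_text = buf.getvalue()

# The same text loads directly into analysis tools, e.g.
# pandas.read_csv('observations.csv') or a SQL bulk import.
records = list(csv.DictReader(io.StringIO(csv_text)))
```

Because the file is plain text with a header row, it round-trips cleanly between the coding tablet, a spreadsheet, and statistical software.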

Figure 3 

pyObs User Interface for the PALs project (Windows Platform).

Depending on specific operating platform requirements, minor adjustments may be necessary (e.g., directory dependencies, color schemes), but the code should otherwise transfer without modification. Python programming etiquette encourages small modules, each reflecting a different function; pyObs, conversely, is a single module. Because the PALs project required frequent updating across handheld devices (e.g., Windows-based tablets with a digitizer pen), we decided during the development phase that a large single module was preferable to multiple smaller modules, to minimize the likelihood of introducing errors. Although not ideal from a programmer's perspective, this strategy ensured rapid and error-free updating across machines. pyObs' GUI uses wxPython [9], a wrapper for the cross-platform GUI API wxWidgets, written in C++. Coders used the digitizer pen to move through the appropriate sequence of drop-down menus and check boxes (see Figure 3). Current technology also allows data entry using a touchscreen instead of a digitizer pen.

Geospatial Reference

Accurate observation of behavior in naturalistic settings requires context; pyObs provides this by embedding a gridded bitmap (BMP) image into the GUI to assist geocoding of the target child's location. Each relevant geographic area appears on the GUI as a cartographic representation of the environment, with cues for location and relative distances between structures and objects in the mapped area. Grids on the BMP map help the observer specify location; grid size, of course, can vary to best meet the research requirements.

In the PALs study, for example, we used 3 ft grids on maps of the outdoor and classroom environments (see Figure 3). Embedded images are either raw BMP files or first converted to Python code with the wxPython img2py tool; the latter method decreases loading time.

To enhance geospatial accuracy, and increase reliability, a tap on the image generates an exploding map. Figure 4 shows an example of an enlarged classroom map. Tapping a location on the enlarged map returns a time-stamped x, y coordinate reference, along with a visual confirmation (see Figure 5).
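The tap-to-coordinate step amounts to a pixel-to-grid conversion plus a timestamp. The sketch below assumes an illustrative map scale; the grid size matches the 3 ft grids used in PALs, but the pixel scale and function name are hypothetical:

```python
import time

GRID_FT = 3          # PALs used 3 ft grid squares
PIXELS_PER_FT = 10   # illustrative map scale, not pyObs' actual value

def tap_to_record(px, py, clock=time.time):
    """Convert a tap's pixel position on the map image to a
    time-stamped grid coordinate: (x, y in grid cells, t in seconds)."""
    cell_px = GRID_FT * PIXELS_PER_FT     # pixels per grid cell
    gx = int(px // cell_px)
    gy = int(py // cell_px)
    return (gx, gy, clock())
```

The enlarged (exploded) map simply increases `PIXELS_PER_FT`, so a tap lands in a smaller real-world area per pixel, which is what improves coder accuracy.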

Figure 4 

Exploding map ready for coder input (OS X).

Figure 5 

Exploding map with entry confirmation (OS X).

In the PALs implementation, we used three different geodata points: the start location (an [x, y, t] point in space and time), the stop location, and the farthest distance traveled. Including the farthest distance allowed us to trace the typical amount of movement within the 10 s window.
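Given [x, y, t] samples, the farthest-distance measure reduces to the greatest straight-line displacement from the start location. A minimal sketch, with an assumed data layout rather than pyObs' internal format:

```python
import math

def farthest_distance(start, points):
    """Greatest straight-line distance (in grid units) from the start
    location among sampled (x, y, t) points in one observation window.
    The (x, y, t) tuple layout is assumed for illustration."""
    x0, y0, _ = start
    return max(math.hypot(x - x0, y - y0) for x, y, _ in points)
```

For example, a child starting at (0, 0) who reaches cell (3, 4) during the window has moved 5 grid units at the farthest point, regardless of the path taken.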


Variants

As pyObs evolved, novel research opportunities arose, resulting in two variants that capture selected micro-features of social interaction among children. Building a typical variant of pyObs, such as those described below, requires modifying small sections of the original code. Because pyObs was built as an omnibus behavioral observation coding system, with its myriad components, changes to the pyObs code are usually made by removing or commenting out irrelevant functions and, if needed, modifying the locations and titles of the appropriate GUI boxes. This takes minimal time and programmer experience.

Additionally, depending on the researcher's preference, populating the choices shown in the GUI is done using either an internal list or an external text configuration file. Although in pyObs the child's range of activities is hard-coded in the Python module, it is sometimes easier to use an external file. For example, the following function, taken from the VisualAttention variant described below, uses the activityList.txt file, which contains a list of the possible child activities (e.g., nap, playgroup). To modify the range, the researcher simply alters the list.

def GetActivityList(self):
    """Read the activity choices from activityList.txt,
    one comma-separated entry per line (e.g., nap, playgroup)."""
    activityList = ['']          # blank first entry as the default selection
    f = open('activityList.txt')
    line = f.readline()
    while len(line) > 0:
        for field in line.strip().split(','):
            if field:
                activityList.append(field.strip())
        line = f.readline()
    f.close()
    return activityList

1. Visual Attention

One variation of pyObs was the foundation coding system used to capture visual attention among preschoolers [10, 11]. To gather these data, observers watched a target child for a 6 s interval and recorded the identity of peers receiving a unit of visual regard from the target, along with affect; visual regard, defined as the orientation of head and eyes toward the peer recipient, measures social engagement.

Figure 6 shows the GUI for this variant. Note that the interface has no GIS map; instead, the observer identifies the location of the interaction using a box for the classroom, and then a site within the room.

Figure 6 

Visual Attention (OS X).

2. Affect Expression

In a more complicated GUI, we developed an interface that covered a larger space, the ground floor of a building with separate classrooms (see Figure 7). This code configuration captures a target child's affect expression during an interaction with another child while engaged in a specific activity; the observation occurs during a 15 s window of time [10, 11]. With this interface, the observer codes the initiator and target of the interaction, plus the observed affective valence (positive, neutral, or negative). Like the example shown in Figure 5, each section of the map explodes to enhance coder accuracy.

Figure 7 

Affect Expression (OS X).

Quality control

pyObs runs on operating systems that support Python. It has been tested on Linux, Mac OS X, and Windows.

(2) Availability

Operating system

pyObs runs on any operating system that supports Python, for example via the Anaconda Python distribution from Continuum Analytics. It has been tested on Linux (Ubuntu, Mint), Mac OS X, and Windows (XP, 7, 10).

Programming language

Python 2.7; pyObs has not been tested for Python 3.5+ compatibility with wxPython 4.0.

Additional system requirements

Requires input method: mouse, digitizer pen, or touchscreen – depending on the operating system.


Dependencies

wxPython 3.0+; pyObs has not been tested with Python 3.5 using the recently released wxPython 4.0, though it should work with minimal code changes. wxPython wraps the wxWidgets C++ toolkit and provides access to the user-interface portions of the wxWidgets API. NumPy.

List of contributors

William Griffin, Xun Li

Software location


Name: Zenodo

Persistent identifier: https://doi.org/10.5281/zenodo.804072

Licence: MIT

Publisher: William Griffin

Version published: 2.2.1

Date published: 07/06/17

Code repository

Name: Github

Persistent identifier: https://github.com/billgriffin/pyObs

Licence: MIT

Date published: 07/06/17



(3) Reuse potential

As shown in the Variants section of this paper, pyObs adapts to the needs of the researcher. At its core, pyObs is a behavioral observation toolkit: all critical components are available, leaving the investigator to assemble them according to the research question. This requires some minor programming skill. The bulk of the assembly consists of deciding which menus to retain, labeling them appropriately through internal or external lists, setting timers and audio cues, installing maps if needed, and modifying data output locations. After making these decisions, the researcher organizes the codes, places appropriately labeled boxes on the GUI, and tests the error-checking algorithm.


For at least a decade, scholarly publications have reported an increase in autism or, more formally, in diagnoses of Autism Spectrum Disorder (ASD) [12]. Recent reviews suggest that observational data are critical for accurate diagnosis and for evaluating treatment outcomes [13, 14]. After reviewing this literature, to illustrate the versatility of pyObs, I took the basic pyObs code structure, borrowed the interface from the Visual Attention variant, and modified it to capture joint attention (the coordinating and sharing of attention between an object and another person), an assumed critical component in identifying autism [15, 16, 17]. Individuals with ASD often show deficits in responding to gestural or physical cues that enhance social coordination.

With a few small code changes, I generated a simple ASD observation coding program (see Figure 8). This joint attention interface illustrates that pyObs contains the key components needed to build a specific observational coding instrument: it captures the target individual, the coder, the location, time references, the observed behavior, and data storage.

Figure 8 

Modified pyObs showing a relevant ASD joint attention collection GUI (OS X).