
Does Learning from Inspections Affect Environmental Performance? – Evidence from Unconventional Well Development in Pennsylvania

Date: October 11, 2016

Abstract

With the growing awareness that operations can affect the environment, regulators increasingly use facility inspections to assess a firm’s environmental performance: whether its operations comply with or violate environmental regulations. When operations violate regulations, firms can face regulatory sanctions for non-compliance and pressure from stakeholders to improve environmental performance. Consequently, firms need to develop organizational knowledge to ensure that their operations conform to regulations. Learning from past inspection experience is critical for the development of such knowledge. Using data on 11,039 unconventional wells developed in Pennsylvania from 2009 to 2014, we investigate how firms can learn from their inspection experience and from such experience of other firms. We find that an unconventional well learns from the inspection experience of other units both within the organization and outside the organization, only when inspections detect violations but not when they confirm compliance. Further, penalties imposed for violations have a divergent effect – they support learning from the inspection experience with violations when it is gained at other units within the organization, but not from such experience gained at units outside the organization. Our results provide insights on how the outcomes of environmental inspections and penalties facilitate the development of organizational knowledge.

Keywords: Environmental Performance, Inspections, Organizational Learning, Penalties, Unconventional Well Development, Fracking.


1. Introduction

Over the past several years there has been a growing recognition that manufacturing operations can affect

the environment. Correspondingly, many firms face significant pressure from their stakeholders (inves-

tors, employees, customers, etc.) to ensure that their operations conform to environmental regulations. An

important consequence of the increased environmental awareness is that manufacturing activities are sub-

ject to increased regulatory scrutiny. In the United States, the Environmental Protection Agency (EPA)

monitors compliance for 44 programs authorized by several statutes, such as the Clean Water Act, Clean

Air Act, Toxic Substances Control Act, etc. (EPA 2015). Environmental inspections are extensively uti-

lized in several of these programs to establish environmental compliance of manufacturing operations.

For instance, the EPA and its regulatory partners conducted 187,563 facility inspections from 2009 to 2014 solely to ensure compliance with the Clean Water Act (ECHO 2015). There are two broad

reasons why environmental inspections are integral to such compliance monitoring programs. The first

obvious reason is that environmental inspections can detect non-compliance and thus drive facilities to

comply with environmental regulations. Inspectors can visit facilities during regular production, collect

information, and review processes, which enables regulators to determine whether operations comply

with or violate environmental regulations. When operations are non-compliant, regulators can issue sanc-

tions that can move facilities towards compliance or initiate actions that can lead to suspension of opera-

tions. These aspects of inspections have been recognized in the literature and several papers have exam-

ined the role of environmental inspections in detecting violations (e.g., Dhanorkar et al. 2015a, Kim

2015). The second reason, which is not so obvious, is that environmental inspections can facilitate the

development of organizational knowledge which can enable organizations to adopt practices aligned with

environmental regulation. This is because when inspectors visit facilities they can highlight potential pit-

falls in operations, present details of solutions to problems, and provide information on best practices.

Moreover, firms can observe the processes that are being scrutinized and understand how they can lead to

environmental issues that get penalized or sanctioned. All these activities can help firms improve their

understanding of the regulatory requirements and inspection processes and thus enable ‘the development

of organizational knowledge’ (organizational learning), which can help firms to comply with environ-

mental regulation as well as avoid penalties or sanctions. Additionally, improvements in environmental

performance may reduce the frequency with which a facility gets inspected and thus lower the regulatory

burden on operations. However, to the best of our knowledge, little research has explored how environ-

mental inspections can facilitate organizational learning. Thus, a critical facet of inspections that not only

facilitates improved environmental outcomes for manufacturing operations but also has implications for

inspection activities remains relatively understudied in the operations management (OM) literature. Con-


sequently, in this study we seek to address the following research questions: 1) How can environmental inspections enable firms to develop organizational knowledge and improve their environmental performance? 2) What factors govern the development of such organizational knowledge?

Since our work is at the intersection of research on organizational learning and inspections, we seek

to address open issues and add to the extant knowledge in these domains. First, we look at the organiza-

tional learning literature, which identifies that firms can improve their performance as they gain experi-

ence: learning from repeated performance of the same task (i.e., prior production experience) is known as autonomous learning (e.g., Argote and Epple 1990), while learning through conscious, targeted efforts is known as induced learning (e.g., Lapré et al. 2000, Nembhard and Tucker

2011). Recent work has sought to unpack the impact of experience (e.g., production experience, targeted

efforts) on organizational learning. Several studies have decomposed experience based on performance

outcomes into experience with ‘success’ (e.g., production of good parts, successful satellite launch) and

‘failure’ (e.g., production of defective parts, failed satellite launch). The lens of success and failure is im-

portant in the domain of environmental inspections because each inspection can result in two outcomes –

either the operations are compliant with environmental regulations (i.e., success) or they are in violation

of regulation (i.e., failure). Evidence indicates that firms can develop organizational knowledge from their

experience with success (e.g., Kim et al. 2009, Baum and Dahlin 2007) and failures (e.g., Haunschild and

Sullivan 2002, Kim and Miner 2007). Firms can also learn vicariously, from the success and failures ex-

perienced at other organizations (e.g., Kim and Miner 2007, Madsen and Desai 2010). Most of this litera-

ture on learning from success and failure is based on aggregate organizational experience, such as unusu-

ally strong performance experienced at commercial banks (Kim et al. 2009), successful or failed satellite

launches experienced at orbital vehicle launch organizations (Madsen and Desai 2010), train accidents

experienced at railroad firms (Baum and Dahlin 2007), etc. By contrast, environmental inspections are

undertaken at the facility level within firms, which means that experience with success and failure is de-

veloped at a more granular level within firms. Therefore, we can contribute to the literature by examining

how a focal unit develops knowledge from success and failure experienced at other units both within and

outside an organization. Thus, our work responds to the call of Madsen and Desai (2010, p. 472) for re-

search that studies the dissemination of knowledge gained from success and failure within organizations.

Next, we look at the inspection literature, which finds that inspection activity and the detection of vio-

lations is governed by a variety of factors, such as the reputation of the manufacturer (Macher et al.

2011), composition and experience of the inspection team (e.g., Macher et al. 2011, Short et al. 2015),

explicit sanctions (King and Lenox 2000), schedule of inspections (Kim 2015), penalties and incentives

(Porteus et al. 2015), to mention a few. Most of this literature takes the perspective of the regulators or


inspecting organizations and investigates how inspections can identify or detect operations that are non-

compliant with regulations. However, what has been often overlooked in this literature is the perspective

of the firms that are being inspected. One exception is Anand et al. (2012); they study the pharmaceutical

industry, where firms are required to adhere to manufacturing processes approved by the U.S. Food and

Drug Administration (FDA). They find that FDA inspections enable pharmaceutical firms to reverse de-

viations from the approved processes. Our paper builds on Anand et al. (2012), as we examine whether

firms develop knowledge (about environmentally friendly practices and regulatory requirements) as they

gain experience with inspections, which can enable them to better comply with regulations. Thus, we aim

to augment the inspection literature by providing a comprehensive picture of how inspections can facili-

tate the development of organizational knowledge and improve environmental performance. In this way,

we respond to the call of Gray and Shimshack (2011, p. 18) for research that examines how individual

units learn from environmental monitoring efforts.

This study uses data from the ‘Pennsylvania Department of Environmental Protection Office of Oil

and Gas Management (DEP)’ that monitors oil and gas well development in Pennsylvania (DEP 2014).

Pennsylvania has witnessed a boom in the development of unconventional wells, which use new produc-

tion techniques (i.e., horizontal drilling coupled with high-volume hydraulic fracturing, more commonly

known as fracking), to capture natural gas embedded within underground shale formations. Our data has

information on 11,039 unconventional wells that were developed across 40 counties from 2009 to 2014

by 113 operators. The DEP instituted a comprehensive environmental inspection program to ensure that

these wells comply with the environmental regulation (as outlined in Pennsylvania’s Oil and Gas Act). In

this period, 168 DEP inspectors conducted 53,718 environmental inspections at the unconventional wells;

of which 2,489 detected violations and in 440 of these instances monetary penalties were also levied. Ta-

ble 1 provides select examples of the violations that were detected. When violations are detected, well

operators are issued a notification of violation. For each violation, the well operators are required to de-

velop a corrective action to ensure the well site becomes compliant (within 180 days of detection). The

DEP records all information on inspections and violations on a public website within fourteen days.

We interacted with several key DEP members and unconventional well operators to better understand

the regulatory perspective and the industry responses. In an interview, a program specialist of the DEP

mentioned: “Many operators think they can easily meet the environmental requirements because they

have drilled and operated wells in Texas (… other states); but very soon they realize that the geology in

Pennsylvania is different – the sub-surface here is fractured down to 100 feet below the surface and the

waterways are in close proximity to drilling sites, which increases the possibility of environmental is-

sues.” This comment highlights the unique challenges of unconventional well development in Pennsylva-


nia, which lead the DEP to consider the inspection process a vital tool to guide and direct operators. As

the DEP’s bureau director stated: “We see ourselves (DEP) as people that provide compliance assistance,

clarify expectations, and help the operators.” Additionally, our interactions with the unconventional well

operators highlighted that several informal mechanisms (e.g., vendor networks, social events, industry

meetings) facilitate the sharing of knowledge which enables them to better comply with regulation.

The development of a large number of unconventional wells in a short time period, coupled with the

unique geology of Pennsylvania, meant that the industry adopted a new production technology rapidly

without the benefit of leveraging its experience from other geographies. As a result, the extensive number

of environmental inspections can play a critical role in helping organizations learn about making opera-

tions compliant. Thus, the setting described above is well suited to investigating how inspections can facili-

tate the development of organizational knowledge.

Our results show that a focal unit (we refer to an unconventional well as the focal unit) learns from

the experience with failure gained at other units within the organization. By contrast, it does not learn

from the experience with success gained at other units within the organization. Thus, we address an

open issue in the organizational learning literature (see Madsen and Desai 2010) by highlighting that

knowledge from failure disseminates across units within an organization. We also find that a focal unit

learns from the experience with failure gained at other units outside the organization, but not from the

experience with success. Finally, we find that penalties have a divergent effect on the impact of experi-

ence with failure. In the presence of penalties, a focal unit learns from the experience with failure when it

is gained at other units within the organization, but it does not learn from such experience when it is gained

at other units outside the organization. Our results are in contrast to the environmental literature, which

finds that the impact of penalties spills over and induces improved performance at other organizations

(Shimshack and Ward 2005). Therefore, we contribute to the environmental literature by illustrating spe-

cific instances when penalties are not effective in improving environmental performance. Finally, our re-

sults are also relevant for regulators and operating managers because they provide insights on how inspec-

tions can affect environmental performance.

The rest of the paper is organized as follows. In section 2, we discuss the relevant literature. In section

3, we present our hypotheses. In section 4, we describe the data and the measures used in our analysis. In

section 5, we discuss our methodology. In section 6, we present our results. In section 7, we discuss the

implications of our findings and the limitations of our analysis.

2. Literature Review


Our work draws on and contributes to the literatures on organizational learning from success and failure,

inspections, and environmental regulations. We provide a broad overview of the relevant literature in the-

se areas, discuss key differences in our study, but defer a more detailed discussion to the next section.

Since the earliest observation of learning effects, scholars have sought to understand how various fac-

ets of prior experience contribute to the development of organizational knowledge (e.g., Darr et al. 1995,

Banker et al. 2001, Argote 2013). A key dimension of prior experience that has received much attention

in the literature is whether prior experience with success or failure can facilitate learning (e.g., Madsen

and Desai 2010, Baum and Dahlin 2007). This literature mainly examines aggregate organizational expe-

rience with success or failure. It finds that, at an organizational level, firms learn from both prior experi-

ence with success (e.g., Baum and Dahlin 2007, Kim et al. 2009) and prior experience with failures (e.g.,

Haunschild and Sullivan 2002, Thirumalai and Sinha 2011), across a variety of settings. However, Levitt

and March (1988) point out that organizations are a collection of units learning in an environment that

includes other multi-unit organizations. This distinction becomes important with the prevalence of the mul-

ti-unit organizational form, which makes it essential to understand how experience acquired at the focal

unit and at other units can contribute to the development of organizational knowledge (Argote and Miron-

Spektor 2011). Therefore, our work adds to the literature by exploring how a focal unit can learn from the

success and failure experienced at other units either within the organization or outside the organization. A

notable study that delves deeper within an organization to examine learning at the individual level is KC

et al. (2013); they examine how individual cardiac surgeons learn from their experience with success and

others' experience with failure. While learning at the individual level is an important element of organiza-

tional learning, Argote and Miron-Spektor (2011) point out that such learning may not always translate to

group or organizational learning. Therefore, our focus on learning at the level of the organizational unit

constitutes an important step toward understanding how organizations learn from prior experience with success

and failure. Additionally, we undertake a deeper examination of the experience with failure because our

setting allows us to investigate how penalties affect learning, which has not been explored in prior work.

The literature on inspections has broadly explored two facets. The first facet examines factors that af-

fect inspection activity and outcomes. Short et al. (2015) examine how the composition of the inspection

team affects the detection of violations. They find that more violations are detected when teams have

women inspectors, have more training, and have members with longer tenure. Macher et al. (2011) examine

how the reputation of manufacturers affects inspection activity. They find that when manufacturers have a

prior reputation of not complying with stipulated processes, regulators inspect such manufacturers more

frequently and identify more violations in the inspections. Ball et al. (2015) study the medical product

industry and find that as inspectors develop plant-specific experience there is an increase in product re-


calls. The emphasis in this facet has been to provide a deeper understanding of the factors that facilitate or

hinder the detection of violations or deviations from stipulated processes. The second facet explores the

impact of inspection activity on operational outcomes. Studies find that explicit sanctions or penalties for

adverse inspection outcomes drive firms to improve their performance (e.g., King and Lenox 2000, Poto-

ski and Prakash 2011). In the pharmaceutical industry, scholars have shown that inspections can serve as

external shocks that help manufacturing facilities reverse the deterioration of processes established with

the FDA (Anand et al. 2012). In contrast to these facets of the inspection literature, we focus on under-

standing whether organizations develop knowledge as they gain experience with inspections. Additional-

ly, we highlight that organizations can improve their environmental performance not only from their di-

rect inspection experience but also from the indirect inspection experience gained at other organizations.

The OM literature recognizes that environmental issues and operations are typically intertwined (e.g.,

Corbett and Klassen 2006, Plambeck 2013), and finds that adopting an environmental perspective can

affect a firm’s stock market performance (e.g., Klassen and McLaughlin 1996, Jacobs et al. 2010) as well

as impact its operations (e.g., Rajaram and Corbett 2002). The importance of the link between environ-

mental issues and operations is reflected in the growing body of empirical work that investigates the in-

terplay of environmental regulations and operations across a variety of domains, such as industrial manu-

facturing (Fu et al. 2015), supplier management (Porteus et al. 2015), and waste management (Dhanorkar

et al. 2015b). The dominant ideas that have been developed in this literature include how the framework

of regulation (e.g., Kroes et al. 2012), incentives and penalties (e.g., Porteus et al. 2015), operational

leanness (Fu et al. 2015), and punitive and supportive tactics (Dhanorkar et al. 2015a) affect environmen-

tal as well as operational outcomes. We contribute to this literature by investigating how environmental

inspections (an important tool in monitoring compliance with environmental regulations) can help firms

develop knowledge required to improve their environmental performance.

In summary, this research contributes to the literature by examining how various facets of inspection

experience lead to the development of organizational knowledge required to improve environmental

performance. Next, we discuss the literature in detail as we formulate our hypotheses.

3. Hypotheses

The focus of our study is to examine how a focal unit can learn from the various facets of inspection

experience gained at other units both within and outside the organization. Consequently, we leverage the

literatures on organizational learning and on learning from success and failure to understand how

environmental inspections can enable the development of organizational knowledge.

3.1 Learning from Inspection Experience of Others: Within and Outside the Organization


Organizational learning refers to the idea that organizations can improve their performance as they gain

experience. Wright (1936) identified that unit costs in airframe production decreased at a constant rate

with the doubling of cumulative production (i.e., production experience). Since then the literature has

used cumulative number of task performances as a measure of organizational experience and explored

how such experience can affect various aspects of organizational performance. Scholars have shown that

organizations can: reduce their production costs with cumulative production (e.g., Argote et al. 1990, Darr

et al. 1995), improve their quality performance with cumulative number of quality projects (e.g., Lapré et

al. 2000), increase patient survival rates with cumulative number of deliberate learning activities (e.g.,

Nembhard and Tucker 2011), etc. Overall, the literature finds that as organizations gain experience they

develop knowledge which enables them to improve their performance across a variety of dimensions.

In a similar manner, inspections can uncover environmental issues which can help organizations to

improve their environmental performance. Once a firm realizes that a facility has potential problems, it

can focus on identifying the underlying causes, exploring potential solutions, and implementing relevant

remedies to address the environmental issues. Additionally, inspections identify facilities that conform to

regulations, which allows a firm to distinguish practices that meet environmental requirements. Thus,

when a firm undergoes multiple environmental inspections (i.e., as it gains inspection experience), it can

develop organizational knowledge that enables it to improve its environmental outcomes.

In our setting, operators (i.e., firms) typically own several unconventional wells. At any given

time, the processes followed at a specific unconventional well will reflect the collective organizational

knowledge developed at the operator. This is because research shows that units within an organization can

learn from the experiences of other units in the organization (e.g., Epple et al. 1996, Darr et al. 1995,

Baum and Ingram 1998). Consequently, a given unconventional well (i.e., a focal unit) will benefit from

the inspection experience gained at the other units of the operator. Thus, we hypothesize:

Hypothesis 1a: A focal unit will improve its environmental performance with the inspection experience

gained at other units within the same organization.

When a firm develops organizational knowledge to improve its environmental performance, some of

this knowledge can benefit other firms. This is because typically knowledge is not an appropriable com-

modity (Arrow 1963); if one firm uses an idea it does not preclude others from learning about it or using

it. Consistent with this notion the literature finds that organizations can learn vicariously from the experi-

ence of other organizations (e.g., Beckman and Haunschild 2002, Knott et al. 2009). Additionally, re-

search indicates that regional proximity facilitates knowledge flows across firm boundaries (e.g., Shaver

and Flyer 2000, Alcácer and Chung 2007). In Pennsylvania, local authorities of a county control several

activities of operators related to unconventional well development (e.g., placement of drilling rigs, waste


pits, pipelines, etc.), which means that operators within a county face a similar governing structure.

Moreover, the DEP inspectors typically operate within a county. Therefore, within a county, we expect a

focal unit will benefit from the inspection experience gained by unconventional wells owned by other op-

erators. Based on the above discussion, we propose the following hypothesis:

Hypothesis 1b: A focal unit will improve its environmental performance with the inspection experience

gained at other units outside the organization.

H1a and H1b are consistent with the literature on organizational learning. In the following hypothe-

ses, we seek to extend the literature by unpacking the impact of inspection experience.

3.2 Learning within the Organization from Inspection Outcomes: Success and Failure

Recent work recognizes that experience can be characterized at a finer level along distinct dimensions

that can have a differing impact on organizational learning (Argote et al. 2003). One dimension that has

gained increased traction is whether organizations learn from successes and failures. The behavioral

theory of the firm indicates that organizations respond differently to success and failure (Cyert and March

1963). Success provides organizations with confirmatory evidence that current actions are effective in

accomplishing organizational tasks. In such a state of the world, the natural focus of organizations will be

on maintaining and consolidating the existing knowledge that produced success. With repeated success,

organizations can develop a better understanding of how various factors interact to achieve desired

organizational outcomes. Thus, organizations can learn and refine their existing knowledge to ensure

successful task accomplishment. Indeed, several studies find that organizations learn from successful

experience (e.g., Baum and Dahlin 2007, KC et al. 2013). The concept of success can be used to charac-

terize inspection experience because inspections can confirm that operations comply with environmental

regulations (i.e., success). Organizations can develop knowledge with successful inspection experience

because such inspections provide confirmatory evidence that the current processes in the facility meet

environmental requirements. Thus, when an organization experiences successful inspections across its

facilities, it develops a better picture of how to manage its operations to meet environmental regulations.

By contrast, failure indicates that existing organizational knowledge is insufficient for effective task

accomplishment. By highlighting what does not work, failure can facilitate organizational learning in two

ways. First, failure can motivate organizations to question existing assumptions and beliefs, which can

facilitate the search to identify new and improved ways of working for effective task accomplishment.

Second, failure often provides pointers on the specific areas that need improvement. Hence, it can direct

the organizational search efforts to the most beneficial areas (Levinthal and March 1981). Several studies

have shown that organizations learn from failure (e.g., Haunschild and Sullivan 2002, Kim and Miner


2007). The concept of failure can also be used to characterize inspection experience because inspections

can identify when operations do not comply with or violate regulation (i.e., failure). When inspections

determine that an organization’s operations are non-compliant with environmental regulations, they reveal

the inadequacies in current operations. Such inspections can accelerate the organization’s efforts to find

and implement solutions to address the relevant environmental issues. As an organization faces more in-

spections that identify violations, it develops a better understanding of how to organize its operations to

avoid environmental issues. Thus, an organization can develop knowledge as it gains experience with in-

spections that identify violations.

Failures are typically endowed with a sense of urgency (Madsen and Desai 2010); organizations need

to fix what is not working. Regulators could impose sanctions or even mandate stoppage of work if the

violations identified in environmental inspections are not resolved. Therefore, after a failure, organiza-

tions are not only more receptive to new ideas but they are also able to quickly implement the relevant

solutions. As a result, in the context of environmental issues, failures may have a larger impact on the de-

velopment of organizational knowledge than success. Interestingly, even in some OM settings scholars

find that failure contributes more to organizational learning than success (e.g., Li and Rajagopalan 1997).

So far, we have explored how an organization can develop knowledge from the experience with success

and failure. Now, we examine whether a focal unit can leverage this knowledge developed across the

units within an organization. There are three broad reasons that will support the transfer of knowledge

from the larger organization to a focal unit. The first reason is that organizational units often have similar

motivations, have comparable structures, and are not direct competitors. This facilitates the development

of personal relationships and the opportunities to communicate across the organizational units, and these

factors can enable knowledge flows across organizational units (Argote 2013). The second reason is that

organizational knowledge is often embedded in routines and templates. Such routines and templates can

be replicated across organizational units to facilitate knowledge transfer (e.g., Knott 2001, Winter et al.

The third reason is that improvements to equipment and technology that enhance organizational out-

comes can be easily adopted across organizational units (e.g., Epple et al. 1996, Argote and Darr 2000).

Thus, we expect that a focal unit will be able to leverage the knowledge developed at other units in the

organization and improve its environmental performance. Based on these considerations, we hypothesize:

Hypothesis 2a: A focal unit will improve its environmental performance as other units within the organi-

zation gain inspection experience with success or failure. Inspection experience with failure will have a

higher impact on the focal unit’s environmental performance than inspection experience with success.

The broad consensus in the environmental literature is that when penalties or explicit sanctions are

imposed for violations, organizations redouble their efforts to identify and implement solutions for the


environmental problems, and as a result adherence to regulation improves (e.g., King and Lenox 2000,

Shimshack and Ward 2005). As a result, we expect a focal unit will benefit more from the experience

with failure of other units within the organization when penalties are imposed. Thus, we hypothesize:

Hypothesis 2b: A focal unit will improve its environmental performance as other units within the organi-

zation gain experience with inspections that identify violations and include penalties. Such experience

will have a higher impact on the focal unit’s environmental performance than the inspection experience

with failure or success.

3.3 Learning from Other Organizations’ Inspection Outcomes: Success and Failure

When a manufacturing facility develops knowledge with successful inspection experience, such

knowledge tends to be retained in the reservoirs of organizational knowledge; in production settings, such

reservoirs typically include the technology, routines, and members of manufacturing facilities (e.g., Ar-

gote and Ingram 2000, Agrawal and Muthulingam 2015). Although it is difficult for a firm to directly ac-

cess another firm’s reservoirs of knowledge, the literature finds that a firm can access the knowledge de-

veloped at other firms; by using similar equipment (e.g., Cheng and Nault 2007), by implementing similar

routines (e.g., Andristos and Tang 2014), and by recruiting new employees (e.g., Singh and Agrawal

2011). However, research shows that such vicarious learning from others experience is limited by

geographical proximity (e.g., Kim and Miner 2007). Thus, a focal unit will benefit when other units

outside the organization, but in close proximity (e.g., within the county), develop knowledge with

successful inspection experience.

Similarly, we can expect that a focal unit will also develop knowledge as other units outside the or-

ganization gain experience with inspections that detect violations. However, several reasons suggest that
learning from others' experience with failure will be greater than learning from others' experience with

success. First, failure may lead to the dispersion of organizational knowledge because firms are often

willing to share the knowledge that led to failure (e.g., Kim and Miner 2007), which makes this

knowledge more accessible to others. Second, even when firms are reluctant to share details of their fail-

ure, external agencies could divulge this information across the industry. For example, the ECHO data-

base of the EPA makes information on inspection failures available to the general public through its web-

site (ECHO 2015). Similarly, in our setting, the DEP makes information on inspections and violations

available to everyone through its website. Moreover, in the context of inspections, the associated inspec-

tors can disseminate detailed information on the failure when they visit other facilities, which can aid the

vicarious learning at a focal unit. Finally, observing that other firms have failed may prompt firms to re-

view and improve their processes in order to avoid similar outcomes (e.g., Baum and Dahlin 2007), but ob-

serving others’ success may not trigger such activities (e.g., Madsen and Desai 2010). Consequently, we


expect that a focal unit will learn more from others’ inspection experience with failure than from others’

inspection experience with success. Based on the above considerations we hypothesize:

Hypothesis 3a: A focal unit will improve its environmental performance when units in other organiza-

tions gain inspection experience with success or failure. Such inspection experience with failure will have

a higher impact on the focal unit’s environmental performance than inspection experience with success.

Shimshack and Ward (2005) find that when a regulator imposes penalties on a facility, it has a spillo-

ver effect that leads to fewer violations at neighboring facilities. They reason that imposing penalties sig-

nals the regulator’s willingness to act on environmental violations, which deters violations at neighboring

facilities within the regulatory jurisdiction. As a result, we expect a focal unit will benefit more from oth-

er organizations’ inspection experience with failure, when penalties are imposed. Thus, we hypothesize:

Hypothesis 3b: A focal unit will improve its environmental performance as units in other organiza-

tions gain experience with inspections that identify violations and include penalties. Such inspection ex-

perience will have a higher impact on the focal unit’s environmental performance than the inspection ex-

perience with failure or success.

4. Data and Measures

4.1 Research Setting

Pennsylvania has significant reserves of natural gas, estimated at around 489 trillion cubic feet (Engelder

2009), which are trapped within sub-surface shale formations, known as the Marcellus and Utica Shale.

The development of unconventional wells enabled the exploitation of these reserves. By 2014, Pennsyl-

vania was the second largest supplier of natural gas in the USA with production in excess of 4 trillion cu-

bic feet (DEP 2014). The DEP is the primary agency that ensures the development and operations of un-

conventional wells conform to environmental regulations (DEP 2014). In this capacity, the DEP collects

information on the drilling and production activities from the operators for each unconventional well.

The DEP has developed broad guidelines which suggest that an unconventional well can be inspect-

ed: 1) just prior to the start of drilling at a new site, 2) after stimulation activities when the well is opera-

tional, 3) subsequent to any alteration, modification, or repairs, 4) before a well becomes inactive, and 5)

following any complaint or incident. Inspectors from the DEP monitor the unconventional wells across

the state. The DEP has decentralized the management of inspection activity and inspectors have the flexi-

bility to plan the inspection schedule based on their travel program and their knowledge of operator ac-

tivity within the county. A typical inspection is conducted by one inspector within a day. The inspection

involves: 1) review of operational records of the unconventional well to ensure that all relevant permits


and approvals have been obtained, 2) measurements from the production equipment to validate the

equipment is functioning properly, 3) collection of samples for further analysis to ensure that there are no

unwanted leakages or migration of chemicals, and 4) visual observation of site operations and site condi-

tions, to check that there is no environmental damage at the site. We also participated as observers in

many environmental inspections conducted by the DEP to better understand the inspection process.

There are three broad types of inspections, which involve: 1) routine review of operations, 2) review

of issues raised in complaints or in response to incidents, and 3) follow-up review from prior inspector

visits. The inspections determine whether a well site complies with or violates environmental regulations.

At the end of the inspection, the inspectors discuss their findings with the personnel and managers at the

well site. Subsequently, an inspection report is prepared and shared with them. The DEP issues a notification of violation to non-compliant wells, and in some instances monetary penalties are imposed for

the violations identified through inspection. The details from the inspection visits and the compliance

tracking activities are entered into eFACTS (Pennsylvania’s Environment Facility Application Compli-

ance Tracking System), which allows industry players and the general public to access the information

and results from the inspection process.

It must be noted that unconventional well development has faced significant opposition in Pennsylva-

nia. Several not-for-profit organizations (e.g., Americans Against Fracking, PennEnvironment, Environ-

ment America) actively seek to highlight environmental violations of well operators. The state has also

witnessed significant legislative activity (e.g., CLEANER Act) against unconventional drilling. Moreo-

ver, unconventional well development is banned in the neighboring state of New York even though it has

proven natural gas reserves. Our interactions indicate that the industry recognizes these issues, which also

prompts the operators to minimize environmental violations.

For the period 2009 to 2014, we collected the following information from the DEP (available at

www.dep.state.pa.us) for each unconventional well: 1) Operator name and location of the well (latitude,

longitude, municipality, and county) and 2) Gas and oil production reported for the time period. Separate-

ly, we collected the following information on the inspection data for each unconventional well: 1) inspec-

tion date, 2) the type of inspection (routine inspection or inspection in response to an incident or com-

plaint), 3) inspector who performed the inspection, 4) violations, and 5) monetary penalties. We used the

above information to create a comprehensive dataset, which has inspection and production data for each

unconventional well, and constitutes the data used in our empirical analyses. Tables 2 and 3 provide the

descriptive statistics and correlations for our data. From 2009 to 2014, the proportion of inspections that

identified violations declined steadily, as shown in Figure 1.

4.2 Measures Used for the Analysis


The main variables in our analysis relate to measures of environmental performance and inspection expe-

rience. Next, we describe these variables and the additional controls used in our analysis.

4.2.1 Dependent Variable: Environmental Performance

Violation ($V_{iocj,t}$) – We use inspection outcomes to measure the environmental performance of an unconventional well. The logic is that when operations comply with environmental regulations, inspections will not find any violations; however, when operations do not comply with regulations, inspections will identify violations. Consequently, $V_{iocj,t}$ for an unconventional well $i$ of operator $o$, in county $c$, that was inspected by inspector $j$ in time period $t$ takes a value of 1 if the inspection detected any violations and 0 otherwise.

4.2.2 Variables for Hypotheses 1a–1b: Inspection Experience Within and Outside the Organization

For each unconventional well $i$, we define variables to represent the inspection experience gained at other units within the organization (i.e., other unconventional wells of the operator) and outside the organization (i.e., unconventional wells of other operators in the county). Our experience variables are defined as cumulative counts of inspections, in line with Lapré et al. (2000) and Nembhard and Tucker (2011). Thus:

Cumulative Inspection-Operator ($Cu^{Op}_{io,t-1}$) – This is the cumulative count of inspections done at other unconventional wells of the operator. This variable excludes the inspection experience of the focal unit and is calculated as: $Cu^{Op}_{io,t-1} = \sum_{s=0}^{t-1}\sum_{m \neq i} IN_{mocj,s}$, where $IN_{mocj,s}$ is 1 if unconventional well $m$ of operator $o$ was inspected by inspector $j$ in period $s$, and is 0 otherwise. In our regression models (in §5.1), if the coefficient of this variable is negative and significant, then we infer that inspection experience gained at other units of the operator contributes to improved environmental performance. (Note: similar logic will be used to assess the impact of the experience variables defined from now onwards.)

Cumulative Inspection-County ($Cu^{Cy}_{ic,t-1}$) – This is the cumulative count of inspections done at unconventional wells that belong to other operators in the county and represents the experience gained at other units outside the organization. This variable excludes the inspection experience of the focal operator in the county, and is calculated as: $Cu^{Cy}_{ic,t-1} = \sum_{s=0}^{t-1}\sum_{p \neq o}\sum_{m \neq i} IN_{mpcj,s}$, where $IN_{mpcj,s}$ is 1 if unconventional well $m$ of operator $p$ was inspected by inspector $j$ in period $s$, and is 0 otherwise.
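To illustrate the construction, a minimal Stata sketch is given below. It assumes a hypothetical well-by-day panel with variables well_id, operator_id, county_id, date, and inspected (a 0/1 indicator); these names are ours for illustration and are not the DEP's or the paper's, and same-day inspections at other units would need explicit tie-handling in a full implementation.

* Hedged sketch: building Cu_Op and Cu_Cy from a well-by-day panel.
* Assumed (hypothetical) variables: well_id, operator_id, county_id, date, inspected (0/1).

* Inspections at the focal well through period t-1 (the current day's inspection is excluded).
sort well_id date
by well_id: gen cum_well = sum(inspected) - inspected

* Running totals across all wells of the operator and across all wells in the county.
sort operator_id date
by operator_id: gen cum_op_all = sum(inspected) - inspected
sort county_id date
by county_id: gen cum_cy_all = sum(inspected) - inspected

* Cu_Op: inspections at other wells of the same operator (focal well excluded).
gen Cu_Op = cum_op_all - cum_well

* Cu_Cy: inspections at wells of other operators in the same county (focal operator excluded).
gen Cu_Cy = cum_cy_all - cum_op_all

The design choice in this sketch is to build each exclusion by differencing running totals rather than looping over units, which keeps the construction a few lines long even with thousands of wells.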

4.2.3 Variables for Hypotheses 2a–2b: Types of Indirect Inspection Experience in the Organization

H2a and H2b examine learning from the inspection outcomes at other units within an organization. To evaluate H2a, we identify whether each inspection resulted in a success (i.e., no violations were observed) or a failure (i.e., violations were identified). This allows us to decompose the operator experience $Cu^{Op}_{io,t-1}$ into two components ($CuS^{Op}_{io,t-1}$, $CuF^{Op}_{io,t-1}$) to form the variables:

Cumulative Success-Operator ($CuS^{Op}_{io,t-1}$) – This is the cumulative count of all inspections without violations done at other unconventional wells of the operator. It is calculated as: $CuS^{Op}_{io,t-1} = \sum_{s=0}^{t-1}\sum_{m \neq i} INS_{mocj,s}$, where $INS_{mocj,s}$ is 1 if violations were not detected when unconventional well $m$ of operator $o$ was inspected by inspector $j$ in period $s$, and is 0 otherwise.

Cumulative Failure-Operator ($CuF^{Op}_{io,t-1}$) – This is the cumulative count of all inspections that detected violations done at other unconventional wells of the operator. It is calculated as: $CuF^{Op}_{io,t-1} = \sum_{s=0}^{t-1}\sum_{m \neq i} INF_{mocj,s}$, where $INF_{mocj,s}$ is 1 if violations were detected when unconventional well $m$ of operator $o$ was inspected by inspector $j$ in period $s$, and is 0 otherwise.

To evaluate H2b, we identify whether penalties were imposed when an inspection identified violations. This allows us to decompose the measure for experience with failure $CuF^{Op}_{io,t-1}$ into two components ($CuP^{Op}_{io,t-1}$, $CuPN^{Op}_{io,t-1}$) to form the variables:

Cumulative Penalty-Operator ($CuP^{Op}_{io,t-1}$) – This is the cumulative count of all inspections done at other unconventional wells of the operator when the DEP observed violations and imposed monetary penalties. It is calculated as: $CuP^{Op}_{io,t-1} = \sum_{s=0}^{t-1}\sum_{m \neq i} INP_{mocj,s}$, where $INP_{mocj,s}$ is 1 if violations were detected and monetary penalties were imposed when unconventional well $m$ of operator $o$ was inspected by inspector $j$ in period $s$, and is 0 otherwise.

Cumulative Penalty No-Operator ($CuPN^{Op}_{io,t-1}$) – This is the cumulative count of all inspections done at other unconventional wells of the operator when the DEP observed violations but did not impose any monetary penalties. It is calculated as: $CuPN^{Op}_{io,t-1} = CuF^{Op}_{io,t-1} - CuP^{Op}_{io,t-1}$.
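Continuing the hedged sketch above, the decomposition only changes which indicator is accumulated. The indicators violation and penalized below are hypothetical 0/1 variables of our own naming, assumed to be coded 0 (not missing) on days without an inspection.

* Hedged sketch: decomposing operator experience into success, failure, and penalty counts.
* Assumed (hypothetical) per-day indicators: violation (0/1) and penalized (0/1).
gen insp_success = inspected * (1 - violation)
gen insp_failure = inspected * violation
gen insp_penalty = inspected * violation * penalized

sort operator_id date
by operator_id: gen cum_s_op_all = sum(insp_success) - insp_success
by operator_id: gen cum_f_op_all = sum(insp_failure) - insp_failure
by operator_id: gen cum_p_op_all = sum(insp_penalty) - insp_penalty

sort well_id date
by well_id: gen cum_s_well = sum(insp_success) - insp_success
by well_id: gen cum_f_well = sum(insp_failure) - insp_failure
by well_id: gen cum_p_well = sum(insp_penalty) - insp_penalty

* Experience at other wells of the operator; the no-penalty component follows by subtraction.
gen CuS_Op  = cum_s_op_all - cum_s_well
gen CuF_Op  = cum_f_op_all - cum_f_well
gen CuP_Op  = cum_p_op_all - cum_p_well
gen CuPN_Op = CuF_Op - CuP_Op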

4.2.4 Variables for Hypotheses 3a–3b: Types of Inspection Experience Outside the Organization

To evaluate H3a and H3b, we define four variables that compute the different types of inspection experience gained at unconventional wells of other operators within a county. The approach is analogous to the one used to develop the independent variables for H2a and H2b. This results in the following variables:

Cumulative Success-County ($CuS^{Cy}_{ic,t-1}$) – This is the cumulative count of all inspections without violations done at unconventional wells of other operators in the county. It is calculated as: $CuS^{Cy}_{ic,t-1} = \sum_{s=0}^{t-1}\sum_{p \neq o}\sum_{m \neq i} INS_{mpcj,s}$, where $INS_{mpcj,s}$ is 1 if violations were not detected when unconventional well $m$ of operator $p$ was inspected by inspector $j$ in period $s$, and is 0 otherwise.

Cumulative Failure-County ($CuF^{Cy}_{ic,t-1}$) – This is the cumulative count of all inspections that detected violations done at unconventional wells of other operators in the county. It is calculated as: $CuF^{Cy}_{ic,t-1} = \sum_{s=0}^{t-1}\sum_{p \neq o}\sum_{m \neq i} INF_{mpcj,s}$, where $INF_{mpcj,s}$ is 1 if violations were detected when unconventional well $m$ of operator $p$ was inspected by inspector $j$ in period $s$, and is 0 otherwise.

Cumulative Penalty-County ($CuP^{Cy}_{ic,t-1}$) – This is the cumulative count of all inspections done at unconventional wells of other operators in the county when the DEP observed violations and imposed monetary penalties. It is calculated as: $CuP^{Cy}_{ic,t-1} = \sum_{s=0}^{t-1}\sum_{p \neq o}\sum_{m \neq i} INP_{mpcj,s}$, where $INP_{mpcj,s}$ is 1 if violations were detected and monetary penalties were imposed when unconventional well $m$ of operator $p$ was inspected by inspector $j$ in period $s$, and is 0 otherwise.

Cumulative Penalty No-County ($CuPN^{Cy}_{ic,t-1}$) – This is the cumulative count of all inspections done at unconventional wells of other operators in the county when the DEP observed violations but did not impose any monetary penalties. It is calculated as: $CuPN^{Cy}_{ic,t-1} = CuF^{Cy}_{ic,t-1} - CuP^{Cy}_{ic,t-1}$.

4.2.5 Controls for the Inspection and Production Experience Gained at the Focal Unit

We use the following variables to control for the inspection experience of the focal unconventional well:

Cumulative Inspection-Well ($Cu_{i,t-1}$) – Measures cumulative inspections at the focal well.
Cumulative Success-Well ($CuS_{i,t-1}$) – Measures cumulative inspections without violations.
Cumulative Failure-Well ($CuF_{i,t-1}$) – Measures cumulative inspections with violations.
Cumulative Penalty-Well ($CuP_{i,t-1}$) – Measures cumulative inspections with penalties.
Cumulative Penalty No-Well ($CuPN_{i,t-1}$) – Measures cumulative inspections with violations but without penalties.

We control for the production experience of the focal unconventional well using a vector $Production_{i,t-1}$ that includes two variables – Cumulative Gas ($CG_{i,t-1}$) and Cumulative Oil ($CO_{i,t-1}$) – which measure the cumulative volume of natural gas (in million cubic feet) and the cumulative volume of oil (in thousand barrels), respectively, produced by focal unit $i$ till period $t-1$.

4.2.6 Controls for the Prior Inspection Record of a Focal Unit

We follow Macher et al. (2011) and control for the reputation that an unconventional well develops with the DEP based on its prior inspection outcomes. We define a vector $Reputation_{i,t-1} = \{GR_{i,t-1}, BR_{i,t-1}, DR_{i,t-1}\}$ that includes three indicator variables: (1) Good Reputation ($GR_{i,t-1}$) – indicates the last two inspections had no violations; (2) Bad Reputation ($BR_{i,t-1}$) – indicates the last two inspections had violations; (3) Deteriorating Reputation ($DR_{i,t-1}$) – indicates the last inspection found violations though the one before it had no violations.
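As a small illustration, a hedged Stata sketch of these indicators is shown below, assuming hypothetical variables viol_last and viol_prev (our naming, not the paper's) that hold the 0/1 outcomes of the well's two most recent prior inspections.

* Hedged sketch: reputation indicators from the outcomes of the last two inspections.
* Assumed (hypothetical) variables: viol_last, viol_prev -- 0/1 outcomes of the two
* most recent DEP inspections of the well prior to period t.
gen GR = (viol_last == 0) & (viol_prev == 0)   // Good Reputation: both clean
gen BR = (viol_last == 1) & (viol_prev == 1)   // Bad Reputation: both found violations
gen DR = (viol_last == 1) & (viol_prev == 0)   // Deteriorating: clean, then a violation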

Past inspection features may affect inspection activity (Macher et al. 2011). We control for the past inspection features using a vector $Feature_{i,t-1}$ that includes the following two indicator variables: Last Inspection-Penalty ($Penalty_{i,t-1}$) – shows that the last inspection resulted in a penalty; Last Inspection-Complaint ($Comp_{i,t-1}$) – shows that the last inspection responded to a complaint or incident.

Research finds that the time since the last inspection can affect inspection outcomes (Anand et al. 2012); we control for this with $DSLC_{i,t-1}$, which denotes the number of days since the last DEP inspection.

We control for the rate of violations (i.e., the ratio of the number of violations to the number of inspections) observed in the prior month at the operator and the county, using a vector $VR_{oc,t-1}$ that includes the Prior Rate of Violations at Operator ($VRO_{o,t-1}$) and the Prior Rate of Violations at County ($VRC_{c,t-1}$).

4.2.7 Fixed Effects

We use a vector of fixed effects $Z_{jioc,t} = \{IP_j, OP_o, CY_c, YR_t\}$ that includes the Inspector ($IP_j$), Operator ($OP_o$), County ($CY_c$), and Year ($YR_t$) fixed effects. Inspector fixed effects control for unobserved heterogeneity in inspector expertise, such as (i) differences in the level of training, (ii) familiarity with the well location, and (iii) other inspector-specific characteristics. Operator fixed effects control for unobserved heterogeneity across operators, such as (i) the size of the firm, (ii) the type of firm (public/private), (iii) firm-level expertise, and (iv) production technology. County fixed effects control for county-specific time-invariant factors, such as (i) the geology of the area, which could impact the operations of an unconventional well, (ii) the water resources, including the river and stream systems that can affect the environmental management of unconventional wells, and (iii) the gas pipeline network, which is required to transport the gas from the production sites. Year fixed effects control for factors that change over time, such as (i) technology, (ii) changes in protocols, and (iii) changes in local government structure.

5. Methodology
We start by examining whether an unconventional well learns from inspection experience gained at other units both within and outside the organization. Then, we study how the various facets of such experience facilitate organizational learning. All our analyses were done using STATA (version 14.1).

Several scholars find that organizations can learn from targeted efforts (induced learning) (e.g., Lapré et al. 2000, Nembhard and Tucker 2011), which suggests that an unconventional well can learn to improve its environmental performance as it gains experience with environmental inspections. Consequently, we can represent an unconventional well's environmental performance ($V^*$) as a function of inspection experience ($CI$). However, in order to econometrically model this relation, we need to address two issues pertinent to our setting. The first relates to the fact that the DEP determines the frequency with which an unconventional well gets inspected. The DEP may select some wells for greater scrutiny based on their past environmental performance or prior inspection history, which means that these wells could get more experience with inspections than other wells. Thus, in our econometric models we need to account for this


selection bias. The second relates to the fact that environmental performance ($V^*$) is latent: we only observe whether an inspection identified any violations ($V_{iock,t} = 1$) or not ($V_{iock,t} = 0$). To address these two issues, we use probit models with sample selection (Van de Ven and Van Praag 1981, Greene 2008) that build on Heckman's (1979) sample-selection model. In brief, this procedure involves the joint maximum-likelihood estimation of: (1) a selection equation that identifies when the dependent variable is observed; and (2) an outcome equation that identifies the binary outcome of the dependent variable. We implemented the procedure using the "heckprobit" routine in STATA.
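
As a rough illustration, a heckprobit call of the following form jointly estimates the selection and outcome equations; this is a sketch with hypothetical variable names, and the covariate lists are only illustrative of the specifications described below, not a record of the authors' code.

* Sketch of the probit model with sample selection (hypothetical variable names).
* The outcome equation is estimated only for well-days on which an inspection occurred.
heckprobit violation cum_insp_operator cum_insp_county cum_insp_well ///
        cum_gas cum_oil good_rep bad_rep det_rep last_pen days_since_insp ///
        prior_rv_operator prior_rv_county i.inspector i.operator i.county i.year, ///
    select(inspected = good_rep bad_rep det_rep last_pen last_complaint ///
        prior_rv_operator prior_rv_county i.county i.year) ///
    vce(cluster well_id)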

In our setting, the selection equation identifies whether the DEP selects to inspect an unconventional

well in a given time period. We use the following specification for the selection equation:

$Well\_IN_{i,t} = \alpha_i + \gamma_1 Reputation_{i,(t-1)} + \gamma_2 Feature_{i,(t-1)} + \gamma_3 RV_{oc,(t-1)} + \delta_c CY_c + \delta_t YR_t + \nu_{ioc,t}$   (1)

In the above equation: $Well\_IN_{i,t}$ indicates whether unconventional well $i$ was selected for inspection in time period $t$ (each calendar day constitutes a period) or not; the other variables are as described in §4.2.6; $CY_c$ represents the county fixed effects; $YR_t$ represents time fixed effects; and $\nu_{ioc,t}$ represents the error term. The selection equation is common across all the specifications of the outcome equations.

The outcome equation identifies whether the inspection detected any violations or not. We use three

outcome equations to evaluate our hypotheses. Next, we discuss the relevant econometric specifications.

5.1 Model to Study Learning from Inspection Experience Gained at Other Units
First, we assess whether a focal unit improves its environmental performance with inspection experience gained at other units within the organization (i.e., operator) and from inspection experience gained at other units outside the organization (i.e., county). To do so, we represent the environmental performance of a focal unit as a function of the inspection experience gained at the operator ($CI^{OP}_{io,(t-1)}$) and county ($CI^{CO}_{ic,(t-1)}$) with the following specification:

$V_{iock,t} = \omega_i + \theta_1 CI^{OP}_{io,(t-1)} + \theta_2 CI^{CO}_{ic,(t-1)} + \theta_3 CI_{i,(t-1)} + \delta_1 Production_{i,(t-1)} + \delta_2 Reputation_{i,(t-1)} + \delta_3 Feature_{i,(t-1)} + \delta_4 RV_{oc,(t-1)} + \delta_5 DSLI_{i,(t-1)} + \delta_6 FE_{koc,t} + \epsilon_{iock,t}$   (2)

The terms used in specification (2) are as defined in §4, and $\epsilon_{iock,t}$ represents the error term. Here, $\theta_1$ and $\theta_2$ denote the learning from inspection experience gained at other units within the organization and outside the organization, respectively. If inspection experience gained at the operator and within the county improves environmental performance, then these coefficients ($\theta_1$, $\theta_2$) will be significant and negative. We estimate specifications (1) and (2) jointly using maximum likelihood. In all our analyses, we cluster standard errors by each unconventional well to account for potential serial correlation (Wooldridge 2002). These results are shown in columns (S1) and (O1), respectively, of Table 4.

5.2 Model to Study Learning from Experience with Success and Failure of Other Units


Now, we assess whether a focal unit benefits from inspection experience with success and failure gained at other units of the operator and from such experience gained at other units in the county. We modify specification (2) to include variables for cumulative success-operator ($CS^{OP}_{io,(t-1)}$), cumulative failure-operator ($CF^{OP}_{io,(t-1)}$), cumulative success-county ($CS^{CO}_{ic,(t-1)}$), and cumulative failure-county ($CF^{CO}_{ic,(t-1)}$) to obtain the following specification:

$V_{iock,t} = \omega_i + \beta_1 CS^{OP}_{io,(t-1)} + \beta_2 CF^{OP}_{io,(t-1)} + \beta_3 CS^{CO}_{ic,(t-1)} + \beta_4 CF^{CO}_{ic,(t-1)} + \beta_5 CS_{i,(t-1)} + \beta_6 CF_{i,(t-1)} + \delta_1 Production_{i,(t-1)} + \delta_2 Reputation_{i,(t-1)} + \delta_3 Feature_{i,(t-1)} + \delta_4 RV_{oc,(t-1)} + \delta_5 DSLI_{i,(t-1)} + \delta_6 FE_{koc,t} + \epsilon_{iock,t}$   (3)

In specification (3), if experience gained with success and failure at other units of an operator improves the environmental performance of an unconventional well, then we expect the coefficients of experience with success and failure ($\beta_1$, $\beta_2$) to be significant and negative. If experience gained with success and failure at units of other operators in a county improves the environmental performance of an unconventional well, then we expect the coefficients of experience with success and failure at the county ($\beta_3$, $\beta_4$) to be significant and negative. We estimate specifications (1) and (3) jointly using maximum likelihood and standard errors clustered by each unconventional well. These results are shown in columns (S2) and (O2), respectively, of Table 4.

5.3 Model to Study the Impact of Penalties on Learning from Inspection Experience
To estimate the impact of monetary penalties imposed at an operator and monetary penalties collected at the county level, we modify specification (3) to include the variables for cumulative penalty-operator ($CP^{OP}_{io,(t-1)}$), cumulative penalty no-operator ($CPN^{OP}_{io,(t-1)}$), cumulative penalty-county ($CP^{CO}_{ic,(t-1)}$), and cumulative penalty no-county ($CPN^{CO}_{ic,(t-1)}$) to obtain the following specification:

$V_{iock,t} = \omega_i + \beta_1 CS^{OP}_{io,(t-1)} + \beta_7 CP^{OP}_{io,(t-1)} + \beta_8 CPN^{OP}_{io,(t-1)} + \beta_3 CS^{CO}_{ic,(t-1)} + \beta_9 CP^{CO}_{ic,(t-1)} + \beta_{10} CPN^{CO}_{ic,(t-1)} + \beta_5 CS_{i,(t-1)} + \beta_{11} CP_{i,(t-1)} + \beta_{12} CPN_{i,(t-1)} + \delta_1 Production_{i,(t-1)} + \delta_2 Reputation_{i,(t-1)} + \delta_3 Feature_{i,(t-1)} + \delta_4 RV_{oc,(t-1)} + \delta_5 DSLI_{i,(t-1)} + \delta_6 FE_{koc,t} + \epsilon_{iock,t}$   (4)

If monetary penalties imposed at an operator contribute to learning, then the coefficient ($\beta_7$) in specification (4) will be significant and negative. If monetary penalties imposed at other units in the county contribute to learning, then the coefficient ($\beta_9$) in specification (4) will be significant and negative. We estimate specifications (1) and (4) jointly using maximum likelihood and standard errors clustered by each unconventional well. These results are shown in columns (S3) and (O3), respectively, of Table 4.

6. Results


In this section, we present our results with the associated impact and discuss the various robustness tests.

6.1 Results with Associated Impact

We start by examining the estimation results for the selection equations in columns (S1, S2, and S3) in

Table 4. We observe that the coefficients of the reputation related variables (i.e., Good Reputation, Bad

Reputation, and Deteriorating Reputation) have the expected signs and are significant (p<0.05). This indi-

cates that the DEP increases inspection activity for unconventional wells with Bad or Deteriorating Repu-

tation and reduces inspection activity for wells with Good Reputation; these findings are consistent with

Macher et al. (2011). The coefficients of past inspection features (i.e., Last inspection-penalty, Last in-

spection-complaint) are positive and significant (p<0.05), which indicates that these factors increase the

inspection activity at a focal well. Further, the Wald tests for the Independence of Equations confirm that the probit model with sample selection is appropriate in our setting. Having confirmed that the estimation

results for the selection equations are consistent and aligned with expectation, we turn to examine the es-

timation results for the outcome equations.

We evaluate H1a and H1b by examining the estimation results for the outcome equation that evalu-

ates learning from inspection experience gained at other units (i.e., specification 2). We refer to column

(O1) in Table 4 and observe that the coefficient of cumulative inspection-operator ($\theta_1$) is not significant; however, the coefficient of cumulative inspection-county ($\theta_2$) is negative and significant (-0.00013,

p<0.001). This suggests that inspection experience gained at the operator does not enable a focal unit to

improve its environmental performance, but inspection experience gained within the county helps a focal

unit to improve its environmental performance. Thus, we find support for H1b. We calculate that one

standard deviation increase in cumulative inspection-county from its average value will lower the proba-

bility of violations by 24.91% (i.e., probability of violations reduces from 4.66% to 3.50%). By contrast,

the above results do not support H1a. We defer further discussion on H1a till we examine the results for

hypotheses H2a and H2b on the impact of different facets of inspection experience gained at the operator.
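
For intuition on how such a probability change can be computed, the following post-estimation sketch evaluates the predicted probability of a violation at the sample mean of cumulative inspection-county (about 1,336 in Table 2) and at one standard deviation above it (about 1,336 + 1,359 = 2,695); the command form and variable name are illustrative, not a record of the authors' exact calculation.

* Sketch: predicted violation probability at the mean of cumulative inspection-county
* and at one standard deviation above the mean (values taken from Table 2).
margins, at(cum_insp_county = (1336 2695))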

We turn to examine the estimation results for the outcome equation that evaluates whether a focal unit

learns from the inspection experience with success and failure gained at other units of the operator or

within the county (i.e., specification 3). In the context of H2a, we refer to column (O2) of Table 4 and

notice that the coefficient of cumulative success-operator ($\beta_1$) is not significant, but the coefficient of cumulative failure-operator ($\beta_2$) is negative and significant (-0.0019, p<0.001). One standard deviation increase in cumulative failure-operator from its average value lowers the probability of violations being detected in an inspection by 22.10% (from 5.09% to 3.97%). Further, a Wald test indicates that these coefficients are significantly different from each other (i.e., $\beta_2 < \beta_1$; p<0.05). These results provide partial

support for H2a because they indicate that a focal unit benefits from the inspection experience with failure


gained at other units of the operator but not from the inspection experience with success. Next, we evaluate H3a and refer to column (O2) of Table 4. We observe that the coefficient of cumulative success-county ($\beta_3$) is not significant, but the coefficient of cumulative failure-county ($\beta_4$) is negative and significant (-0.0017, p<0.001). Further, a Wald test indicates that these coefficients are significantly different from each other (i.e., $\beta_4 < \beta_3$; p<0.001). One standard deviation increase in cumulative failure-county from its average value lowers the probability of violations by 26.88% (from 5.09% to 3.72%). These results provide partial support for H3a because they indicate that a focal unit learns from the inspection experience with failure but not from the inspection experience with success gained at other units within the county. Additionally, a Wald test indicates that $\beta_2 < \beta_4$ (p<0.001), which suggests that inspection experience with failure gained at other units within the organization has a higher impact on learning than such experience gained at other units outside the organization.
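
The coefficient comparisons reported above are Wald tests on the jointly estimated model. A sketch of how such tests could be issued after the heckprobit estimation is shown below; variable names are hypothetical, and the outcome equation is referenced by the name of its dependent variable. The reported directional conclusions combine these equality tests with the signs and magnitudes of the estimates.

* Sketch of the Wald tests comparing learning coefficients after joint estimation.
test [violation]cum_fail_operator = [violation]cum_success_operator   // beta2 vs. beta1
test [violation]cum_fail_county   = [violation]cum_success_county     // beta4 vs. beta3
test [violation]cum_fail_operator = [violation]cum_fail_county        // beta2 vs. beta4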

To examine the impact of penalties on learning, we study the estimation results for the outcome equa-

tion that evaluates the impact of penalties on learning from inspection experience gained at other units of

the operator and at other units within the county (i.e., specification 4). To evaluate Hypothesis 2b, we refer to column (O3) of Table 4 and find that the coefficient of cumulative penalty-operator ($\beta_7$) is significant and negative (-0.0126, p<0.01). Additionally, the coefficient of cumulative penalty no-operator ($\beta_8$) is also significant and negative (-0.0021, p<0.01). Further, a Wald test confirms that the impact of violations with penalties at the operator is significantly greater than that of violations without penalties (i.e., $\beta_7 < \beta_8$, p<0.001). We calculate that one standard deviation increase in cumulative penalty-operator (i.e., an increase of around 11 inspections with violations and penalties) from its average value lowers the probability of violations by 22.13% (from 5.16% to 4.02%). By contrast, cumulative penalty no-operator needs to increase by over 65 inspections with violations to achieve a similar impact. These results support H2b and indicate that penalties can contribute to improved environmental performance of a focal unit when they are levied for the violations detected at the operator. For Hypothesis 3b, we observe that the coefficient of cumulative penalty-county ($\beta_9$) is not significant, but the coefficient of cumulative penalty no-county ($\beta_{10}$) is significant and negative (-0.0018, p<0.01). One standard deviation increase in

cumulative penalty no-county from its average value lowers the probability of violations by 26.14% (from

5.16% to 3.81%). These results do not support H3b and indicate that penalties do not contribute to learn-

ing from inspection experience gained at other units in the county. Overall, our results highlight the dif-

fering impact of penalties on learning within and outside the organization.

Our results for H2a and H2b offer limited support for H1a, because they provide evidence that a focal

unit benefits from specific types of inspection experience gained at other units of the operator.

6.2 Robustness Tests


We did several tests to validate the robustness of our results.

To supplement the probit sample selection models used to handle the selection bias in our analysis,

we explored an alternative instrumental variables approach to address the endogeneity issues. In line with

the requirements for instruments (Wooldridge 2002), we used two instruments that are related to the in-

spection experience of an unconventional well but are otherwise unrelated to the error terms. The first

instrument uses the average inspection experience of a cluster of neighboring wells to instrument for the

inspection experience of each unconventional well. The cluster of neighboring wells was identified with

the k-median clustering algorithm that uses the latitude and longitude information of each well. The clus-

ter's average inspection experience will be related to the inspection experience of each well because inspectors often try to conserve on travel by inspecting wells in geographic proximity. Additionally, the cluster's average inspection experience will not be related to the factors that influence the DEP to inspect a specific unconventional well, and therefore this variable will not be correlated with the error terms. These

factors make the cluster’s average inspection experience a valid instrument. Note that our approach is

similar to the one used in Cachon and Olivares (2010), who examine the impact of production flexibility

on finished goods inventory, wherein they use the production flexibility of other models to instrument for the potentially endogenous production flexibility of a specific model. The second instrument uses the mu-

nicipality’s average inspection experience to instrument for the inspection experience of each well. The

logic that justifies the use of the second instrument is similar to the one used for the first instrument. Our

results remain essentially the same with both instruments, as shown in Table A1 of the online appendix.
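
A sketch of how this robustness check could be set up follows. The use of ivprobit mirrors the IV probit estimates reported in Table A1, but the number of location groups (k = 50), the use of a plain group average as the instrument, and all variable names are assumptions made for illustration.

* Sketch: group wells by location with k-median clustering, then use the group's average
* inspection experience as an instrument for the focal well's inspection experience.
cluster kmedians latitude longitude, k(50) name(wellgroup)
egen iv_cluster_exp = mean(cum_insp_well), by(wellgroup)

ivprobit violation cum_insp_operator cum_insp_county cum_gas cum_oil ///
        good_rep bad_rep det_rep last_pen days_since_insp ///
        prior_rv_operator prior_rv_county i.county i.year ///
        (cum_insp_well = iv_cluster_exp), vce(cluster well_id)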

Our analysis does not differentiate between different types of violations (other than the consideration

of penalties). It is possible that immediately corrected violations contribute to learning, whereas other vio-

lations (that take longer to rectify) do not contribute to learning. To investigate this issue, in our data we

identified 391 inspections with violations that were rectified immediately and 2,098 inspections with vio-

lations that took longer to rectify. We redid our analysis by only using information on violations that took

longer to rectify. Our results remain essentially the same as shown in Table A2 of the online appendix.

Our analysis used the cumulative count of inspections with violations and penalties to assess the im-

pact of penalties. This approach gives equal weight to all violations with penalties. However, the varia-

tions in the imposed penalties could affect learning. To address this issue, we modified our models to use

the cumulative value of penalties to assess the impact of penalties. These results are similar to those ob-

tained in our main models and are provided in Table A3 of the online appendix.

In our data, some unconventional wells had no inspections or recorded no production. We redid our

analysis by excluding the data from these wells. Our results remain consistent as shown in Table A4 of

the online appendix.


Several studies have examined the impact of success and failure using logistic regression (e.g., Madsen and Desai 2010, KC et al. 2013); accordingly, we also redid our analysis with logistic regression, and our results and conclusions remain essentially the same.
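
A sketch of this check, with hypothetical variable names and covariates mirroring specification (3), is:

* Sketch: logistic-regression version of the outcome equation (specification (3)).
logit violation cum_success_operator cum_fail_operator cum_success_county cum_fail_county ///
        cum_success_well cum_fail_well cum_gas cum_oil good_rep bad_rep det_rep ///
        last_pen days_since_insp prior_rv_operator prior_rv_county ///
        i.inspector i.operator i.county i.year, vce(cluster well_id)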

One concern is that specific types of violations could automatically attract penalties. If this occurs

then our results on penalties could be related to specific violation types and not to penalties explicitly.

However, in our data each type of violation with penalties has several occurrences without penalties (such

overlap is observed for over 99% of the violations). Thus, this issue is not a concern in our context.

7. Discussion and Conclusion
This study extends the literature by casting light on how environmental inspections enable organizations

to develop knowledge, which helps them to improve their environmental performance. Our results show

that the inspection outcomes and associated monetary penalties affect organizational learning. A focal

unit learns from the inspection experience of other units within the organization, when such inspections

identify environmental violations. The learning effect is augmented in the presence of monetary penalties.

The impact of inspection experience gained outside the organization is more restrictive. A focal unit only

learns from the inspection experience with failure of other units outside the organization, but this effect

is absent in the presence of penalties. By contrast, a focal unit fails to learn from inspection experience

with success, irrespective of whether it is gained within or outside the organization.

Our results naturally lead to three broad questions. First, why does a focal unit fail to learn from the

successful experience gained at the larger organization but learns from the experience with failure at the

larger organization? Successful inspections provide evidence that the existing processes work and meet

environmental requirements. As a result, decision-makers may infer that further development of

knowledge is not required. Further, decision-makers can become overconfident about their knowledge,

which could make them less receptive to any suggestions for improvement from other sources, such as

environmental inspectors, vendors, or industry peers. These factors could hinder organizations from devel-

oping a deeper understanding of the routines and processes required to manage a broad range of environ-

mental issues. Consequently, a focal unit may be unable to leverage the inspection experience with suc-

cess gained at the other units within the organization. Thus, our study extends the individual level results

of KC et al. (2013) to the organizational unit level because we show that organizational units also fail to

learn from the successful experience of other units within the organization. By contrast, inspections that

identify violations indicate that the current processes and routines are not effective in meeting environ-

mental regulations, and therefore they provide pointers on how to improve environmental outcomes. Ad-

ditionally, discussions with the DEP personnel indicate that when DEP inspectors detect violations it pro-

vides them a platform to share their knowledge gleaned from visiting other facilities with organizational


members. Moreover, identification of violations indicates that it is not “business-as-usual” (Louis and

Sutton 1991), which prompts decision-makers to switch their cognitive processes and makes them more

receptive to suggestions and information that can improve their environmental performance. These factors

could help organizations develop a deep understanding of the requirements to avoid a range of environmental issues, and a focal unit could leverage this experience to improve its environmental performance.

Second, why does a focal unit fail to learn from the successful experience gained at units in other or-

ganizations within the county but learns from their experience with failure? A key reason why a focal unit

may not benefit from the successful experience of other organizations is that many firms protect the

knowledge developed through success (e.g., Anton and Yao 2004, Katila et al. 2008), which makes this

knowledge more challenging to access. Additionally, when organizations gain experience with success

they may not take steps to explicitly retain this knowledge in organizational routines or equipment. As a

result, such knowledge will be tacit and research shows that tacit knowledge is less likely to spill over and

benefit other organizations (e.g., Argote and Ingram 2000). By contrast, when others experience failure, a

focal unit may be able to extract meaningful knowledge because of two reasons. First, a focal unit could

leverage common suppliers (e.g., Cheng and Nault 2007), knowledge of new recruits (e.g., Singh and

Agrawal 2011), and other informal mechanisms (e.g., vendor networks, social events, industry meetings

and events) to access the solutions adopted within the county to address failure. This also helps the focal

unit to circumvent some of the challenges involved in identifying and implementing solutions to address

the root causes of failures. Second, Baum and Dahlin (2007) point out that when individual organizations

lack sufficient experience with failure, they turn to others' failures to understand what not to do. Since un-

conventional wells within a county (even when they belong to other operators) face similar environmental

challenges, learning from their failure can be valuable. In our setting, an average focal unit experiences

less than one inspection that detects violations, which makes it likely that the manager of an unconventional well may turn to others' experience with failures.

Third, why do penalties augment the learning from failures experienced within the organization but

not from failures experienced outside the organization? The reasons can be traced to the different mecha-

nisms by which penalties affect the development of knowledge. Within the organization, penalties aug-

ment the learning effect in two ways. First, penalties signal the importance of a specific environmental issue to the organization, which could motivate organizations to redouble their efforts towards identifying solutions to address such issues (e.g., King and Lenox 2000). Second, in many organizations, payments towards penalties are not a part of operating budgets, which means that financial approvals are required to

process these payments. Often, as part of the approval process, managers would need to provide plans to

address the related environmental issues, which would support the knowledge development efforts. Addi-


tionally, operators also need to provide the DEP with plans on how they intend to address the environ-

mental violations, which further supports knowledge development. By contrast, the environmental litera-

ture reasons that when penalties are imposed on a plant it signals the regulator’s overall willingness to

levy penalties, which deters violations at neighboring plants within the regulatory jurisdiction (Shimshack

and Ward 2005). The idea is that penalties on other organizations signal the seriousness of regulatory

scrutiny, which increases the focal unit's efforts to comply with regulation. This reason may not be relevant

to our context because each operator faces multiple DEP inspections in a year, as opposed to prior studies

(e.g., Shimshack and Ward 2005, Gray and Shimshack 2011) where a firm may face, on average, one or fewer inspections in a year. Thus, in our setting, operators are already exposed to a high level of reg-

ulatory scrutiny and would not need the additional signal from others’ penalties to understand the serious-

ness of the regulatory efforts. Additionally, our discussions with operators indicate that penalties do not

provide them more information on regulatory activity; instead, they infer that the penalized unit may have poor operational practices. As a result, neighboring plants fail to learn from the penalized unit as it is not viewed as a credible knowledge source (e.g., Szulanski 1996). Therefore, our results contribute to the envi-

ronmental literature by illustrating specific instances when penalties are not effective in improving envi-

ronmental performance.

Overall, our results provide pointers on how to improve environmental performance. Firms that face

violations could focus their efforts on understanding what other organizations did to address the environ-

mental issues. Further, firms within a region could create a common platform for sharing and exchanging

ideas on how to avoid environmental issues. Regulators could focus on making information on environ-

mental violations along with the solutions more transparent to facilitate the learning process. For instance,

although the DEP provides information on environmental violations through a public website, this information could be augmented with details of the solutions that were implemented to address specific environmental

issues. Additionally, to leverage the inspection experience with success, firms and the DEP could consid-

er identifying good operational practices and creating a platform for sharing this information so that the

industry can better comply with environmental regulations.

Limitations of our study could be addressed in future research. First, our investigation treats all types

of violations as equal; we did not consider the impact of different types of environmental violations.

While this is a good initial approximation, future studies can consider the impact of the different types of

violations in greater detail. Second, it could be reasoned that the organizations in our setting are not learn-

ing to improve their environmental performance but, on the contrary, are learning to better hide their violations. This is likely to happen if all the inspections of an unconventional well were done by the same

inspector. However, in our setting this concern is mitigated as over 70% of the unconventional wells were


inspected by different inspectors in the period of our study. Third, we could not obtain the costs incurred

by the organizations to address each type of violation. Future studies could obtain detailed cost infor-

mation to assess how costs affect the development of organizational knowledge. Fourth, we were unable

to quantify the economic impact of reducing environmental violations and this could be an interesting

area for further research. Finally, our exploration of the impact of inspections is in the oil and gas indus-

try, and it remains to be seen whether our results can be generalized to other settings. We hope our work

will stimulate further work on the impact of inspections on environmental performance.

References

Agrawal, A., S. Muthulingam. 2015. Does Organizational Forgetting Affect Vendor Quality Perfor-

mance? – An Empirical Investigation. Manufacturing & Service Operations Management. 17(3): 350-

367.

Alcácer, J., W. Chung. 2007. Location Strategies and Knowledge Spillovers. Management Science. 53

(5):760-776.

Anand, G., J. Gray, and E. Siemsen. 2012. Decay, Shock, and Renewal: Operational Routines and Process

Entropy in the Pharmaceutical Industry. Organization Science. 23(6):1700-1716.

Andritsos, D.A., C.S. Tang. 2014. Linking Process Quality and Resource Usage: An Empirical Analysis.

Production and Operations Management. 23(12): 2163-2177.

Anton, J.J., D.A. Yao. 2004. Little Patents and Big Secrets: Managing Intellectual Property. RAND

Journal of Economics. 35(1): 1-22.

Argote, L. 2013. Organizational Learning: Creating, Retaining and Transferring Knowledge. 2nd ed.

Springer, New York, USA.

Argote, L., S.L. Beckman, and D. Epple. 1990. The Persistence and Transfer of Learning in Industrial

Settings. Management Science. 36(2): 140-154.

Argote, L., E. Darr. 2000. Repositories of Knowledge in Franchise Organizations: Individual, Structural

and Technological. The Nature and Dynamics of Organizational Capabilities. 82(1): 51-65.

Argote, L., D. Epple. 1990. Learning Curves in Manufacturing. Science. 247(4945) 920-924.

Argote, L., P. Ingram. 2000. Knowledge Transfer: A Basis for Competitive Advantage in Firms.

Organizational Behavior and Human Decision Processes. 82(1): 150-169.

Argote, L., B. McEvily, and R. Reagans. 2003. Managing Knowledge in Organizations: An Integrative Framework and Review of Emerging Themes. Management Science. 49: 571-582.

Argote, L., E. Miron-Spektor. 2011. Organizational Learning: From Experience to Knowledge. Organiza-

tion Science. 22(5): 1123-1137.


Arrow, K. 1963. Uncertainty and the Welfare Economics of Medical Care. American Economic Review.

53(5): 941-973.

Ball, G., E. Siemsen, and R. Shah. 2015. Inspector Experience and Product Recalls in the Medical Device

Industry. Working Paper.

Banker, R.D., J.M. Field, and K.K. Sinha. 2001. Work-Team Implementation and Trajectories of Manu-

facturing Quality: A Longitudinal Field Study. Manufacturing & Service Operations Manage-

ment. 3(1): 25-42.

Baum, J.A.C, K.B. Dahlin. 2007. Aspiration Performance and Railroads’ Patterns of Learning from Train

Wrecks and Crashes. Organization Science. 18(3): 368-385.

Baum, J.A.C., P. Ingram. 1998. Survival-Enhancing Learning in the Manhattan Hotel Industry, 1898–

1980. Management Science. 44(7): 996-1016.

Beckman, C.M., P.R. Haunschild. 2002. Network Learning: The Effects of Partners’ Heterogeneity of

Experience on Corporate Acquisitions. Administrative Science Quarterly. 47(1): 92-124.

Cachon, G.P., M. Olivares. 2010. Drivers of Finished-Goods Inventory in the U.S. Automobile Industry. Management Science. 56(1): 202-216.

Cheng, Z., B.R. Nault. 2007. Industry Level Supplier-Driven IT spillovers. Management Science. 53(8):

1199-1216.

Corbett, C.J., R.D. Klassen. 2006. Extending the Horizons: Environmental Excellence as Key to

Improving Operations. Manufacturing & Service Operations Management 8(1): 5-22.

Cyert, R.M., J.G. March. 1963/1992. A Behavioral Theory of the Firm, 2nd ed. Prentice Hall, Englewood

Cliffs, NJ.

Darr, E. D., L. Argote, and D. Epple. 1995. The Acquisition, Transfer, and Depreciation of Knowledge in

Service Organizations: Productivity in Franchises. Management Science 41(11) 1750-62.

DEP. 2014. 2014 Oil and Gas Annual Report: Department of Environmental Protection, Pennsylvania.

Available at http://www.dep.pa.gov/Pages/default.aspx

Dhanorkar, S., E. Siemsen, and K. Linderman. 2015a. Promoting Change from the Outside: Directing

Managerial Attention in the Implementation of Environmental Improvements. Working Paper.

Dhanorkar, S., K. Donohue, and K. Linderman. 2015b. Repurposing Materials and Waste through Online

Exchanges: Overcoming the Last Hurdle. Production and Operations Management. 24(9):1473-1493.

ECHO. 2015. Enforcement and Compliance History Online. Available at http://echo.epa.gov/tools/data-

downloads.


Engelder, T.M. 2009. Report Card on the Breakout Year for Gas Production in the Appalachian Basin.

Fort Worth Basin Oil&Gas Magazine. August 18-22. Available at

http://www.marcellus.psu.edu/resource-s/PDFs/marcellusengelder.pdf.

EPA. 2015. How We Monitor Compliance. Available at http://www2.epa.gov/compliance/how-we-

monitor-compliance.

Epple, D., L. Argote, and K. Murphy. 1996. An Empirical Investigation of the Microstructure of

Knowledge Acquisition and Transfer through Learning by Doing. Operations Research. 44(1) 77–86.

Fu, W., B. Kalkanci, and R. Subramanian. An Empirical Investigation of Emissions Reductions under

Changing Assessments of Hazard. Working Paper.

Gray, W.B., J.P. Shimshack. 2011. The Effectiveness of Environmental Monitoring and Enforcement: A

Review of the Empirical Evidence. Review of Environmental Economics and Policy. 5(1): 3–24.

Greene, W.H., 2008. Econometric Analysis. 6th. Ed. Pearson Education Inc., Upper Saddle River, NJ.

Haunschild, P.R., B.N. Sullivan. 2002. Learning from Complexity: Effects of Prior Accidents and

Incidents on Airlines’ Learning. Administrative Science Quarterly. 47(4): 609-643.

Heckman, J. 1979. Sample Selection Bias as a Specification Error. Econometrica. 47: 153-161.

Jacobs, B.W., V.R. Singhal, and R. Subramanian. 2010. An Empirical Investigation of Environmental

Performance and the Market Value of the Firm. Journal of Operations Management. 28(5): 430-441.

Katila, R., J.D. Rosenberger, and K.M. Eisenhardt. 2008. Swimming with Sharks: Technology Ventures,

Defense Mechanisms and Corporate Relationships. Administrative Science Quarterly. 53(2): 295-332.

KC, D., B.R. Staats, and F. Gino. 2013. Learning from My Success and from Others’ Failure: Evidence

from Minimally Invasive Cardiac Surgery. Management Science. 59(11): 2435-2449.

Kim, J-Y.J., A.S. Miner. 2007. Vicarious Learning from the Failures and Near-Failures of Others:

Evidence from the US Commercial Banking Industry. Academy of Management Journal. 50(3): 687-

714.

Kim, J-Y, J-Y Kim, and A.S. Miner. 2009. Organizational Learning from Extreme Performance

Experience: The Impact of Success and Recovery Experience. Organization Science. 20(6): 958-978.

Kim, S.-H. 2015. Time to Come Clean? Disclosure and Inspection Policies for Green Production.

Operations Research. 63(1): 1-20.

King, A. and M. Lenox. 2000. Industry Self-Regulation Without Sanctions: The Chemical Industry's Re-

sponsible Care program. Academy of Management Journal. 43(4): 698–716.

Klassen, R.D., C.P. McLaughlin. 1996. The Impact of Environmental Management on Firm Performance.

Management Science. 42(8): 1199-1214.

Knott, A.M. 2001. The Dynamic Value of Hierarchy. Management Science. 47(3): 430-448.


Knott, A.M., H.E. Posen, and B. Wu. 2009. Spillover Asymmetry and Why It Matters. Management

Science. 55(3): 373-388.

Kroes, J., R. Subramanian, and R. Subramanyam. 2012. Operational Compliance Levers, Environmental

Performance, and Firm Performance Under Cap and Trade Regulation. Manufacturing & Service Op-

erations Management. 14(2): 186-201.

Lapré, M.A., A.S. Mukherjee, and L.V. Wassenhove. 2000. Behind the Learning Curve: Linking

Learning Activities to Waste Reduction. Management Science. 46(5) 597-611.

Levinthal, D., J.G. March. 1981. A Model of Adaptive Organizational Search. Journal of Economic

Behavior & Organization. 2(4): 307-333.

Levitt, B., J.G. March. 1988. Organizational Learning. Annual Review of Sociology. 319-340.

Li, G., S. Rajagopalan. 1997. The Impact of Quality on Learning. Journal of Operations Management.

15(3): 181-191.

Li, G., S. Rajagopalan. 1998. Process Improvement, Quality, and Learning Effects. Management Science.

44(11): 1517-1532.

Louis, M.R., R.I. Sutton. 1991. Switching Cognitive Gears: From Habits of Mind to Active Thinking.

Human Relations. 44: 55–76.

Macher, J.T., J.W. Mayo, and J.A. Nickerson. 2011. Regulator Heterogeneity and Endogenous Efforts to

Close the Information Asymmetry Gap. Journal of Law and Economics. 54(1): 25-54.

Madsen, P.M., V. Desai. 2010. Failing to Learn? The Effects of Failure and Success on Organizational

Learning in the Global Orbital Launch Vehicle Industry. Academy of Management Journal. 53(3):

451-476.

Nembhard, I.M., A.L. Tucker. 2011. Deliberate Learning to Improve Performance in Dynamic Service

Settings: Evidence from Hospital Intensive Care Units. Organization Science. 22(4) 907-22.

Plambeck, E.L. 2013. OM forum-Operations Management Challenges for Some “Cleantech” Firms.

Manufacturing & Service Operations Management. 15(4): 527-536.

Porteous, A.H., S.V. Rammohan, and H.L. Lee. 2015. Carrots or Sticks? Improving Social and Environ-

mental Compliance at Suppliers Through Incentives and Penalties. Production and Operations Man-

agement. 24(9): 1402-1413.

Potoski, M., A. Prakash. 2011. Voluntary Programs, Compliance and the Regulation Dilemma. In D. Le-

vi-Faur (Ed.), Handbook on the Politics of Regulation, 84-95. Cheltenham: Edward Elgar Pub.

Rajaram, K., C.J. Corbett. 2002. Achieving Environmental and Productivity Improvements Through

Model-Based Process Redesign. Operations Research. 50(5): 751-763.


Singh, J., A. Agrawal. 2011. Recruiting for Ideas: How Firms Exploit the Prior Inventions of New Hires.

Management Science. 57(1): 129-150.

Shaver, J.M., F. Flyer. 2000. Agglomeration Economies, Firm Heterogeneity, and Foreign Direct

Investment in the United States. Strategic Management Journal. 21(12):1175-1193.

Shimshack, J.P., M.B. Ward. 2005. Regulator Reputation, Enforcement, and Environmental Compliance.

Journal of Environmental Economics and Management. 50(3): 519–540.

Short, J.L., M.W. Toffel, and A.R. Hugill. 2015. Monitoring Global Supply Chains. Strategic Manage-

ment Journal.

Szulanski, G. 1996. Exploring Internal Stickiness: Impediments to the Transfer of Best Practice Within

the Firm. Strategic Management Journal. 17(S2): 27–43.

Thirumalai, S., K.K. Sinha. 2011. Product Recalls in the Medical Device Industry: An Empirical Explora-

tion of the Sources and Financial Consequences. Management Science. 57(2): 376-392.

Van de Ven, W.P., B.M. Van Praag. 1981. The Demand for Deductibles in Private Health Insurance: A

Probit Model with Sample Selection. Journal of Econometrics. 17(2):229-252.

Winter, S.G., G. Szulanski, D. Ringov, and R.J. Jensen. 2012. Reproducing Knowledge: Inaccurate Rep-

lication and Failure in Franchise Organizations. Organization Science. 23(3): 672-685.

Wooldridge, J. 2002. Econometric Analysis of Cross Section and Panel Data. 2nd ed. MIT Press, USA.

Wright, T.P. 1936. Factors Affecting the Cost of Airplanes. Journal of the Aeronautical Sciences. 3(4)

122-28.


Figure 1: Proportion of Violations Being Detected in Inspections Over Time

[Figure: bar chart of the proportion of inspections that detected violations by year (2009: 0.146, 2010: 0.125, 2011: 0.062, 2012: 0.030, 2013: 0.025, 2014: 0.018); y-axis: Proportion, x-axis: Year.]

Table 1: Select Examples of Environmental Violations Identified by DEP

Note: Pa.Code indicates regulation under Pennsylvania’s Oil and Gas Act.

Number | Description of Violation used by DEP | Pennsylvania Legal Citation
1 | DISCHARGE REQUIREMENTS - Operator discharged a substance into waters of the Commonwealth contrary to 25 Pa. Code Chapters 91, 93, 95 and 102, The Clean Streams Law and the 2012 Oil and Gas Act. | 25 Pa. Code § 78.60(a)
2 | DISPOSAL OF DRILL CUTTINGS - Owner or operator disposed of drill cuttings from below the casing seat, in a pit that does not meet the requirements of 25 Pa. Code Sections 78.62(a)(5)-(18) and Section 78.62(b). | 25 Pa. Code § 78.61(c)1
3 | GENERAL PROVISIONS - CASING AND CEMENTING - Operator conducted casing and cementing activities that failed to prevent the migration of gas or other fluids into coal seams. | 25 Pa. Code § 78.81(a)4
4 | PIPELINES UNDER STREAM BEDS - Owner or permittee failed to meet the minimum amount required for cover between the top of the pipe or encasement and the lowest point in the stream bed. | 25 Pa. Code § 105.313(a)
5 | PITS AND TANKS FOR TEMPORARY CONTAINMENT - Operator failed to contain pollutional substances and wastes from the drilling, altering, completing, recompleting, servicing and plugging the well, including brines, drill cuttings, drilling muds, oils, stimulation fluids, well treatment and servicing fluids, plugging and drilling fluids other than gases in a pit, tank or series of pits and tanks. | 25 Pa. Code § 78.56(a)
6 | WELLS IN A HYDROGEN SULFIDE AREA - Operator failed to operate the well in which hydrogen sulfide was detected in concentrations of 20 ppm or greater in a way that presents no danger to human health and environment. | 25 Pa. Code § 78.77(c)


Table 2: Summary Statistics

Variable | Mean | Std. Dev. | Min | Max | Number
(1) Violation | 0.046 | 0.210 | 0 | 1 | 53,718
(2) Cumulative Inspection-Operator | 1,320 | 1,235 | 0 | 5,405 | 53,718
(3) Cumulative Inspection-County | 1,336 | 1,359 | 0 | 6,725 | 53,718
(4) Cumulative Success-Operator | 1,247 | 1,190 | 0 | 5,121 | 53,718
(5) Cumulative Failure-Operator | 72.6 | 69.9 | 0 | 284 | 53,718
(6) Cumulative Penalty-Operator | 12.7 | 11.4 | 0 | 46 | 53,718
(7) Cumulative Penalty No-Operator | 59.9 | 63.2 | 0 | 253 | 53,718
(8) Cumulative Success-County | 1,234 | 1,276 | 0 | 6,400 | 53,718
(9) Cumulative Failure-County | 102 | 107 | 0 | 417 | 53,718
(10) Cumulative Penalty-County | 16.0 | 14.5 | 0 | 75 | 53,718
(11) Cumulative Penalty No-County | 86.3 | 99.3 | 0 | 387 | 53,718
(12) Cumulative Inspection-Well | 4.13 | 5.02 | 0 | 57 | 53,718
(13) Cumulative Success-Well | 3.88 | 4.77 | 0 | 53 | 53,718
(14) Cumulative Failure-Well | 0.25 | 0.71 | 0 | 9 | 53,718
(15) Cumulative Penalty-Well | 0.05 | 0.27 | 0 | 4 | 53,718
(16) Cumulative Penalty No-Well | 0.20 | 0.63 | 0 | 8 | 53,718
(17) Cumulative Production Gas (million cu ft) | 211 | 589 | 0 | 11,657 | 53,718
(18) Cumulative Production Oil (000's barrels) | 37.2 | 649.0 | 0 | 35,003 | 53,718
(19) Good Reputation | 0.930 | 0.260 | 0 | 1 | 10,231,440
(20) Bad Reputation | 0.005 | 0.068 | 0 | 1 | 10,231,440
(21) Deteriorating Reputation | 0.030 | 0.170 | 0 | 1 | 10,231,440
(22) Prior Rate of Violations at Operator | 0.033 | 0.095 | 0 | 1 | 10,231,440
(23) Prior Rate of Violations at County | 0.040 | 0.078 | 0 | 1 | 10,231,440
(24) Last Inspection-Penalty | 0.005 | 0.072 | 0 | 1 | 10,231,440
(25) Last Inspection-Complaint | 0.035 | 0.180 | 0 | 1 | 10,231,440
(26) Days Since Last Inspection | 222 | 284 | 0 | 2,164 | 10,231,440

Table 3: Correlations

(1) (2) (3) (4) (5) (6) (7) (8) (9) (10) (11) (12) (13) (14) (15) (16) (17) (18) (19) (20) (21) (22) (23) (24) (25)(1) Violation 1.000(2) Cumulative Inspection-Operator -0.008 1.000(3) Cumulative Inspection-County -0.007 0.359 1.000(4) Cumulative Success-Operator -0.008 0.999 0.356 1.000(5) Cumulative Failure-Operator -0.004 0.676 0.270 0.642 1.000(6) Cumulative Penalty-Operator -0.004 0.734 0.134 0.724 0.643 1.000(7) Cumulative Penalty No-Operator -0.004 0.619 0.274 0.583 0.991 0.538 1.000(8) Cumulative Success-County -0.007 0.363 0.998 0.362 0.254 0.143 0.256 1.000(9) Cumulative Failure-County -0.005 0.220 0.760 0.206 0.377 -0.004 0.416 0.722 1.000

(10) Cumulative Penalty-County -0.007 0.243 0.615 0.251 0.031 0.018 0.031 0.607 0.538 1.000(11) Cumulative Penalty No-County -0.004 0.201 0.726 0.185 0.399 -0.006 0.440 0.686 0.993 0.434 1.000(12) Cumulative Inspection-Well 0.001 0.202 0.111 0.208 0.028 0.094 0.015 0.125 -0.075 -0.085 -0.068 1.000(13) Cumulative Success-Well -0.001 0.219 0.112 0.226 0.019 0.102 0.004 0.127 -0.092 -0.080 -0.087 0.987 1.000(14) Cumulative Failure-Well 0.007 -0.032 0.034 -0.037 0.058 -0.015 0.067 0.030 0.069 -0.055 0.082 0.404 0.252 1.000(15) Cumulative Penalty-Well 0.003 -0.026 -0.025 -0.025 -0.035 -0.001 -0.039 -0.022 -0.057 -0.027 -0.057 0.254 0.186 0.473 1.000(16) Cumulative Penalty No-Well 0.007 -0.025 0.048 -0.031 0.080 -0.017 0.091 0.043 0.102 -0.051 0.116 0.349 0.206 0.930 0.114 1.000(17) Cumulative Production Gas -0.005 0.302 0.264 0.292 0.358 0.264 0.349 0.263 0.209 0.071 0.214 0.173 0.155 0.157 0.036 0.162 1.000(18) Cumulative Production Oil -0.001 0.084 -0.011 0.087 0.010 0.122 -0.010 -0.005 -0.073 0.077 -0.089 -0.001 0.000 -0.010 0.039 -0.028 0.023 1.000(19) Good Reputation -0.008 0.056 0.024 0.062 -0.055 -0.001 -0.060 0.031 -0.055 0.043 -0.064 -0.058 0.027 -0.501 -0.198 -0.482 -0.080 0.008 1.000(20) Bad Reputation 0.007 -0.028 -0.023 -0.029 0.006 -0.010 0.008 -0.025 0.010 -0.016 0.013 0.014 -0.027 0.239 0.070 0.241 0.011 -0.007 -0.241 1.000(21) Detriorating Reputation 0.007 -0.038 -0.016 -0.042 0.034 0.007 0.036 -0.020 0.034 -0.031 0.041 0.014 -0.034 0.275 0.107 0.265 0.049 -0.015 -0.624 -0.012 1.000(22) Prior Rate of Violations at Operator 0.008 -0.152 -0.160 -0.156 -0.039 -0.036 -0.036 -0.163 -0.087 -0.117 -0.077 -0.084 -0.093 0.024 0.009 0.024 -0.064 -0.019 -0.069 0.042 0.065 1.000(23) Prior Rate of Violations at County 0.007 -0.227 -0.232 -0.231 -0.085 -0.095 -0.077 -0.239 -0.091 -0.136 -0.078 -0.130 -0.143 0.029 0.008 0.029 -0.088 -0.027 -0.080 0.045 0.070 0.425 1.000(24) Last Inspection-Penalty 0.003 -0.023 -0.033 -0.023 -0.024 0.004 -0.027 -0.032 -0.034 -0.019 -0.034 0.009 -0.010 0.112 0.303 -0.001 0.009 -0.002 -0.256 0.065 0.382 0.031 0.034 1.000(25) Last Inspection-Complaint 0.001 0.076 0.064 0.077 0.038 0.097 0.026 0.066 0.029 0.059 0.023 0.038 0.035 0.031 0.020 0.027 0.074 0.033 -0.057 0.009 0.086 -0.021 -0.036 0.044 1.000(26) Days Since Last Inspection -0.008 0.289 0.230 0.286 0.227 0.228 0.211 0.233 0.134 0.155 0.122 0.048 0.038 0.071 0.039 0.064 0.332 0.085 -0.083 0.026 0.028 -0.081 -0.128 0.002 0.046


Table 4: Estimation Results to Examine the Impact of Inspection Experience on Violations

Notes: * p<0.05, ** p<0.01, *** p<0.001. We report coefficient estimates with cluster robust standard errors in parentheses. All models are significant at p<0.001.

(O1) (O2) (O3) (S1) (S2) (S3)Constant -2.16530*** -2.34401*** -2.27678*** Constant -2.09988*** -2.09988*** -2.09989***

(0.5906) (0.5974) (0.5980) (0.0360) (0.0360) (0.0360)H1a, H1b - (Inspection Experience at Other Units) Good Reputation -0.03792** -0.03792** -0.03792**

Cumulative Inspection-Operator (θ1) -0.00003 (0.0134) (0.0134) (0.0134)(0.00003) Bad Reputation 0.22784*** 0.22784*** 0.22786***

Cumulative Inspection-County (θ2) -0.00013*** (0.0392) (0.0392) (0.0392)(0.00002) Deteriorating Reputation 0.14724*** 0.14724*** 0.14727***

(0.0185) (0.0185) (0.0185)H2a, H2b - (Learning from within the Organization) Last Inspection-Penalty 0.08452*** 0.08452*** 0.08453***

Cumulative Success-Operator (β1) 0.00002 0.00007 (0.0145) (0.0145) (0.0145)(0.00003) (0.00004) Last Inspection-Complaint 0.08833** 0.08836** 0.08821**

Cumulative Failure-Operator (β2) -0.00194** (0.0326) (0.0326) (0.0326)(0.0007) Prior Rate of Violations at Operator 0.01221 0.01222 0.01223

Cumulative Penalty-Operator (β7) -0.01261** (0.0248) (0.0248) (0.0248)(0.0039) Prior Rate of Violations at County -0.33393*** -0.33394*** -0.33394***

Cumulative Penalty No-Operator (β8) -0.00207** (0.0391) (0.0391) (0.0391)(0.0007) Number 10,231,440 10,231,440 10,231,440

Wells 11,039 11,039 11,039H3a, H3b - (Learning from outside the Organization)

Cumulative Success-County (β3) -0.00006 -0.00005 Wald Test for Independence of Equations(0.00003) (0.00004) Rho 0.043 0.044 0.043

Cumulative Failure-County (β4) -0.00172** Selection-Chi-Square 15.93*** 16.21*** 15.856***(0.0006)

Cumulative Penalty-County (β9) -0.00208(0.0032)

Cumulative Penalty No-County (β10) -0.00179** (0.0006)

Controls: Focal Unit- Inspection and Production ExperienceCumulative Inspection-Well 0.00591*

(0.0029)Cumulative Success-Well 0.00138 0.00197

(0.0036) (0.0035)Cumulative Failure-Well 0.04699**

(0.0170)Cumulative Penalty-Well -0.03159

(0.0381)Cumulative Penalty No-Well 0.06088**

(0.0191)Cumulative Gas (million cubic feet) 0.00001 0.00001 0.00001

(0.00002) (0.00002) (0.00002)Cumulative Oil (thousand barrels) -0.000001 -0.000002 -0.000004

(0.00001) (0.00001) (0.00001)

Controls: Focal Unit- Prior Inspection RecordGood Reputation -0.01319 0.05632 0.05405

(0.0534) (0.0567) (0.0567)Bad Reputation 0.31190** 0.24358* 0.24255*

(0.0974) (0.1029) (0.1040)Deteriorating Reputation 0.08120 0.08537 0.08559

(0.0630) (0.0639) (0.0640)Last Inspection-Penalty 0.27516*** 0.28344*** 0.28426***

(0.0571) (0.0571) (0.0573)Days Since Last Inspection 0.00007 0.00005 0.00004

(0.0001) (0.0001) (0.0001)Prior Rate of Violations at Operator 0.70491*** 0.65358*** 0.65476***

(0.1182) (0.1190) (0.1193)Prior Rate of Violations at County -0.26060 -0.31820* -0.30534

(0.1561) (0.1599) (0.1603)Fixed Effects

Inspector, Operator, County, Year Fixed Effects Yes Yes YesLog-Likelihood -278,156 -278,103 -278,097Wells 11,039 11,039 11,039Number of Inspections 53,718 53,718 53,718

Selection Equation(D.V.: Well Inspected = 1 if well was inspected, 0 otherwise)

Outcome Equation(Dependent Variable (D.V.): Violations = 1 if any violation was detected, 0 otherwise)


Online Supplement
This section presents the results for a selection of the robustness checks we did to validate the results reported in the paper. Table A1 provides the estimation results for the regression models with instrumental variables. Table A2 provides the estimation results for the regression models that only consider information on violations that were not immediately rectified (i.e., excluding violations that were corrected immediately). Table A3 provides the estimation results for the regression models that use the cumulative amount of penalties imposed for violations (instead of the cumulative count of violations with penalties). Table A4 provides the estimation results for the regression models that exclude data from unconventional wells that had no inspections or no production. Overall, the results from these analyses support the findings reported in the main paper.


Table A1: Instrumental Variables Probit Estimates to Examine the Impact of Inspection Experience on Violations

| | (1) | (2) | (3) | (4) | (5) | (6) |
|---|---|---|---|---|---|---|
| Constant | -2.1151*** (0.6488) | -2.1807*** (0.6380) | -2.1383*** (0.6364) | -2.1646*** (0.6484) | -2.2619*** (0.6350) | -2.1911*** (0.6342) |
| H1a, H1b - (Inspection Experience at Other Units) | | | | | | |
| Cumulative Inspection-Operator (θ1) | -0.00002 (0.00003) | | | -0.00003 (0.00003) | | |
| Cumulative Inspection-County (θ2) | -0.0001*** (0.00003) | | | -0.0001*** (0.00003) | | |
| H2a, H2b - (Learning from within the Organization) | | | | | | |
| Cumulative Success-Operator (β1) | | 0.00005 (0.00004) | 0.00008* (0.00004) | | 0.00004 (0.00004) | 0.00007 (0.00004) |
| Cumulative Failure-Operator (β2) | | -0.00324** (0.0011) | | | -0.0027* (0.0011) | |
| Cumulative Penalty-Operator (β7) | | | -0.0212** (0.0067) | | | -0.0212** (0.0070) |
| Cumulative Penalty No-Operator (β8) | | | -0.0031** (0.0010) | | | -0.0026* (0.0011) |
| H3a, H3b - (Learning from outside the Organization) | | | | | | |
| Cumulative Success-County (β3) | | -0.00001 (0.00004) | -0.00001 (0.00004) | | -0.00002 (0.00004) | -0.00002 (0.00004) |
| Cumulative Failure-County (β4) | | -0.0024** (0.0009) | | | -0.0022* (0.0009) | |
| Cumulative Penalty-County (β9) | | | 0.0013 (0.0070) | | | 0.0034 (0.0070) |
| Cumulative Penalty No-County (β10) | | | -0.0027** (0.0009) | | | -0.0024** (0.0009) |
| Controls: Focal Unit - Inspection and Production Experience | | | | | | |
| Cumulative Inspection-Well | 0.0119** (0.0041) | | | 0.02201*** (0.0048) | | |
| Cumulative Success-Well | | 0.0122** (0.0047) | 0.0122* (0.0047) | | 0.0288*** (0.0059) | 0.0288*** (0.0059) |
| Cumulative Failure-Well | | -0.0033 (0.0264) | | | -0.0461 (0.0314) | |
| Cumulative Penalty-Well | | | -0.0708 (0.0836) | | | -0.1840 (0.0999) |
| Cumulative Penalty No-Well | | | 0.0026 (0.0287) | | | -0.3137 (0.0338) |
| Cumulative Gas (million cubic feet) | -6.87E-07 (0.00003) | -3.92E-06 (0.00003) | -5.73E-06 (0.00003) | -2.80E-05 (0.00003) | -3.01E-05 (0.00003) | -3.27E-05 (0.00003) |
| Cumulative Oil (thousand barrels) | -0.0209 (0.0163) | -0.0328 (0.0233) | -0.0308 (0.0207) | -0.0226 (0.0159) | -0.0270 (0.0152) | -0.0268 (0.0151) |
| Controls: Focal Unit - Prior Inspection Record | | | | | | |
| Good Reputation | -0.0065 (0.0638) | -0.0143 (0.0675) | -0.0148 (0.0676) | 0.0053 (0.0635) | -0.0388 (0.0695) | -0.0376 (0.0695) |
| Bad Reputation | 0.4932*** (0.1338) | 0.4971*** (0.1377) | 0.4944*** (0.1387) | 0.4892*** (0.1331) | 0.5463*** (0.1375) | 0.5328*** (0.1388) |
| Deteriorating Reputation | 0.1642* (0.0812) | 0.1612* (0.0820) | 0.1598* (0.8243) | 0.1704* (0.0811) | 0.1757* (0.0818) | 0.1685* (0.8236) |
| Last Inspection-Penalty | 0.1222 (0.1490) | 0.1284 (0.1508) | 0.1463 (0.1539) | 0.1173 (0.1488) | 0.1221 (0.1507) | 0.1809 (0.1536) |
| Days Since Last Inspection | 0.0002* (0.0001) | 0.0002* (0.0001) | 0.0002* (0.0001) | 0.0002* (0.0001) | 0.0002* (0.0001) | 0.0002* (0.0001) |
| Prior Rate of Violations at Operator | 0.8733*** (0.1301) | 0.8388*** (0.1307) | 0.8368*** (0.1309) | 0.8817*** (0.1305) | 0.8717*** (0.1311) | 0.8699*** (0.1312) |
| Prior Rate of Violations at County | -0.2995 (0.1815) | -0.4327* (0.1887) | -0.4003* (0.1882) | -0.3331 (0.1810) | -0.4693* (0.1859) | -0.4466* (0.1860) |
| Inspector, Operator, County, Year Fixed Effects | Yes | Yes | Yes | Yes | Yes | Yes |
| Log-Likelihood | -93,430.16 | -91,736.45 | -90,426.48 | -10,197.97 | -10,146.68 | -11,501.54 |
| Wells | 10,790 | 10,590 | 10,590 | 10,790 | 10,590 | 10,590 |
| Number of Inspections | 45,574 | 45,574 | 45,574 | 45,574 | 45,574 | 45,574 |

Notes: Dependent Variable (D.V.): Violations = 1 if any violation was detected, 0 otherwise. Instruments: Inspection Experience of Cluster of Neighboring Wells; Inspection Experience of Wells in Municipality. * p<0.05, ** p<0.01, *** p<0.001. We report coefficient estimates with cluster-robust standard errors in parentheses. All models are significant at p<0.001.
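The instrumental-variables probit reported in Table A1 can be approximated with a two-step control-function (Rivers-Vuong style) estimator: regress the potentially endogenous inspection-experience measure on the instrument and the exogenous controls, then include the first-stage residual in the probit for violations. The sketch below assumes, for illustration, that the focal well's cumulative inspection count is the endogenous regressor and that the instrument is the inspection experience of neighboring wells; all column names are hypothetical placeholders, and this is not the authors' estimation code.

```python
# Minimal control-function sketch of an IV probit (Rivers-Vuong two-step).
# Variable names (violation, cum_inspection_well, instrument_neighbor_inspections,
# operator) are hypothetical, not the study's actual data.
import pandas as pd
import statsmodels.api as sm

def iv_probit_control_function(df: pd.DataFrame, exog_cols: list,
                               cluster_col: str = "operator"):
    # First stage: endogenous inspection experience on the instrument plus controls.
    X1 = sm.add_constant(df[["instrument_neighbor_inspections"] + list(exog_cols)])
    first = sm.OLS(df["cum_inspection_well"], X1).fit()

    # Second stage: probit of violation on the endogenous regressor, the controls,
    # and the first-stage residual (the control-function term).
    df2 = df.assign(first_stage_resid=first.resid)
    X2 = sm.add_constant(df2[["cum_inspection_well"] + list(exog_cols) + ["first_stage_resid"]])
    second = sm.Probit(df2["violation"], X2).fit(
        disp=False, cov_type="cluster", cov_kwds={"groups": df2[cluster_col]}
    )
    # Caveat: these second-step standard errors ignore the generated-regressor issue;
    # a bootstrap or a joint maximum-likelihood estimator would correct for it.
    return first, second
```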


Table A2: Estimation Results to Examine the Impact of Inspection Experience for Violations that were not immediately corrected

Outcome Equation (Dependent Variable (D.V.): Violations = 1 if any violation was detected, 0 otherwise)

| | (O1) | (O2) | (O3) |
|---|---|---|---|
| Constant | -2.824*** (0.7002) | -3.024*** (0.7146) | -2.971*** (0.7135) |
| H1a, H1b - (Inspection Experience at Other Units) | | | |
| Cumulative Inspection-Operator (θ1) | 0.00000 (0.00003) | | |
| Cumulative Inspection-County (θ2) | -0.00014*** (0.00003) | | |
| H2a, H2b - (Learning from within the Organization) | | | |
| Cumulative Success-Operator (β1) | | 0.00005 (0.00004) | 0.00008* (0.00004) |
| Cumulative Failure-Operator (β2) | | -0.00211** (0.00077) | |
| Cumulative Penalty-Operator (β7) | | | -0.0090* (0.0041) |
| Cumulative Penalty No-Operator (β8) | | | -0.0020* (0.0008) |
| H3a, H3b - (Learning from outside the Organization) | | | |
| Cumulative Success-County (β3) | | -0.00007 (0.00004) | -0.00008* (0.00004) |
| Cumulative Failure-County (β4) | | -0.0017** (0.0006) | |
| Cumulative Penalty-County (β9) | | | 0.00231 (0.0034) |
| Cumulative Penalty No-County (β10) | | | -0.0018** (0.0006) |
| Controls: Focal Unit - Inspection and Production Experience | | | |
| Cumulative Inspection-Well | 0.0075* (0.0030) | | |
| Cumulative Success-Well | | 0.00294 (0.0036) | 0.00355 (0.0036) |
| Cumulative Failure-Well | | 0.05162** (0.0168) | |
| Cumulative Penalty-Well | | | -0.01635 (0.0396) |
| Cumulative Penalty No-Well | | | 0.0652*** (0.0187) |
| Cumulative Gas (million cubic feet) | -0.00002 (0.00003) | -0.00002 (0.00003) | -0.00002 (0.00003) |
| Cumulative Oil (thousand barrels) | 0.000004 (0.00001) | 0.000003 (0.00001) | 0.000003 (0.00001) |
| Controls: Focal Unit - Prior Inspection Record | | | |
| Good Reputation | -0.02608 (0.0552) | 0.04454 (0.0589) | 0.04311 (0.0590) |
| Bad Reputation | 0.3156** (0.0982) | 0.2417* (0.1032) | 0.2383* (0.1040) |
| Deteriorating Reputation | 0.08374 (0.0662) | 0.08652 (0.0671) | 0.08632 (0.0672) |
| Last Inspection-Penalty | 0.2695*** (0.0605) | 0.2796*** (0.0605) | 0.2810*** (0.0606) |
| Days Since Last Inspection | 0.0002* (0.0001) | 0.00015 (0.0001) | 0.00014 (0.0001) |
| Prior Rate of Violations at Operator | 0.7002*** (0.1200) | 0.6465*** (0.1204) | 0.6527*** (0.1207) |
| Prior Rate of Violations at County | -0.17440 (0.1617) | -0.23200 (0.1660) | -0.23190 (0.1666) |
| Inspector, Operator, County, Year Fixed Effects | Yes | Yes | Yes |
| Log-Likelihood | -275,505 | -275,450 | -275,446 |
| Wells | 11,034 | 11,034 | 11,034 |
| Number of Inspections | 53,718 | 53,718 | 53,718 |

Selection Equation (D.V.: Well Inspected = 1 if well was inspected, 0 otherwise)

| | (S1) | (S2) | (S3) |
|---|---|---|---|
| Constant | -2.108*** (0.0360) | -2.108*** (0.0360) | -2.108*** (0.0360) |
| Good Reputation | -0.03787** (0.0134) | -0.03787** (0.0134) | -0.03787** (0.0134) |
| Bad Reputation | 0.2249*** (0.0395) | 0.2249*** (0.0395) | 0.2249*** (0.0395) |
| Deteriorating Reputation | 0.1452*** (0.0185) | 0.1452*** (0.0185) | 0.1452*** (0.0185) |
| Last Inspection-Penalty | 0.08996** (0.0326) | 0.0900** (0.0326) | 0.08989** (0.0326) |
| Last Inspection-Complaint | 0.08184*** (0.0146) | 0.08184*** (0.0146) | 0.08184*** (0.0146) |
| Prior Rate of Violations at Operator | 0.00296 (0.0251) | 0.00296 (0.0251) | 0.00296 (0.0251) |
| Prior Rate of Violations at County | -0.3354*** (0.0395) | -0.3354*** (0.0395) | -0.3354*** (0.0395) |
| Number | 10,231,049 | 10,231,049 | 10,231,049 |
| Wells | 11,034 | 11,034 | 11,034 |
| Wald Test for Independence of Equations: Rho | 0.035 | 0.035 | 0.034 |
| Selection-Chi-Square | 9.39*** | 9.36*** | 8.96*** |

Notes: * p<0.05, ** p<0.01, *** p<0.001. We report coefficient estimates with cluster-robust standard errors in parentheses. All models are significant at p<0.001.
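The outcome/selection structure reported in Tables A2-A4 corresponds to a standard probit model with sample selection (a bivariate probit in which the violation outcome is observed only for inspected wells). As a sketch of that general form, and not necessarily the authors' exact specification, let the selection equation be $s_i = 1\{z_i'\gamma + u_i > 0\}$, let the outcome $y_i = 1\{x_i'\beta + \varepsilon_i > 0\}$ be observed only when $s_i = 1$, and let $(u_i, \varepsilon_i)$ be bivariate standard normal with correlation $\rho$. The log-likelihood is then:

```latex
\ln L(\beta,\gamma,\rho) =
    \sum_{i:\, s_i = 0} \ln \Phi\!\left(-z_i'\gamma\right)
  + \sum_{i:\, s_i = 1,\, y_i = 1} \ln \Phi_2\!\left(x_i'\beta,\; z_i'\gamma;\; \rho\right)
  + \sum_{i:\, s_i = 1,\, y_i = 0} \ln \Phi_2\!\left(-x_i'\beta,\; z_i'\gamma;\; -\rho\right)
```

where $\Phi$ and $\Phi_2$ denote the univariate and bivariate standard normal distribution functions. The "Rho" and "Selection-Chi-Square" rows in the tables report the estimated $\rho$ and the test of $H_0\!: \rho = 0$, i.e., independence of the outcome and selection equations.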


Table A3: Estimation Results to Examine the Impact of Inspection Experience by Considering the Cumulative Amount of Penalties Imposed

Outcome Equation (Dependent Variable (D.V.): Violations = 1 if any violation was detected, 0 otherwise)

| | (O3) |
|---|---|
| Constant | -2.364*** (0.5947) |
| H2a, H2b - (Learning from within the Organization) | |
| Cumulative Success-Operator (β1) | 0.00003 (0.00003) |
| Cumulative Penalty Amount-Operator (β7) | -7.90E-08** (0.00000003) |
| Cumulative Penalty No-Operator (β8) | -0.002** (0.0007) |
| H3a, H3b - (Learning from outside the Organization) | |
| Cumulative Success-County (β3) | -0.00006 (0.00004) |
| Cumulative Penalty Amount-County (β9) | -1.55E-08 (0.00000003) |
| Cumulative Penalty No-County (β10) | -0.0017** (0.0006) |
| Controls: Focal Unit - Inspection and Production Experience | |
| Cumulative Success-Well | 0.00181 (0.0035) |
| Cumulative Penalty Amount-Well | -1.73E-07 (0.0000002) |
| Cumulative Penalty No-Well | 0.06094** (0.0193) |
| Cumulative Gas (million cubic feet) | 0.00001 (0.00002) |
| Cumulative Oil (thousand barrels) | -0.000001 (0.00001) |
| Controls: Focal Unit - Prior Inspection Record | |
| Good Reputation | 0.05605 (0.0565) |
| Bad Reputation | 0.2386* (0.1042) |
| Deteriorating Reputation | 0.08454 (0.0639) |
| Last Inspection-Penalty | 0.2848*** (0.0572) |
| Days Since Last Inspection | 0.00005 (0.0001) |
| Prior Rate of Violations at Operator | 0.6432*** (0.1193) |
| Prior Rate of Violations at County | -0.30670 (0.1602) |
| Inspector, Operator, County, Year Fixed Effects | Yes |
| Log-Likelihood | -278,099 |
| Wells | 11,039 |
| Number of Inspections | 53,718 |

Selection Equation (D.V.: Well Inspected = 1 if well was inspected, 0 otherwise)

| | (S3) |
|---|---|
| Constant | -2.1*** (0.0360) |
| Good Reputation | -0.0379** (0.0134) |
| Bad Reputation | 0.2279*** (0.0392) |
| Deteriorating Reputation | 0.1473*** (0.0185) |
| Last Inspection-Penalty | 0.08823** (0.0326) |
| Last Inspection-Complaint | 0.0845*** (0.0146) |
| Prior Rate of Violations at Operator | 0.01222 (0.0248) |
| Prior Rate of Violations at County | -0.3339*** (0.0391) |
| Number | 10,231,440 |
| Wells | 11,039 |
| Wald Test for Independence of Equations: Rho | 0.044 |
| Selection-Chi-Square | 16.12*** |

Notes: * p<0.05, ** p<0.01, *** p<0.001. We report coefficient estimates with cluster-robust standard errors in parentheses. The model is significant at p<0.001.
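Table A3 replaces the count-based penalty measures with the cumulative dollar amount of penalties previously imposed. One way such a regressor could be constructed from inspection-level records is sketched below; the column names (operator, inspection_date, penalty_amount) are hypothetical, and the code is illustrative rather than the authors' own.

```python
# Illustrative sketch: prior cumulative penalty amount at the operator level,
# excluding the focal inspection's own penalty. Column names are hypothetical.
import pandas as pd

def cumulative_penalty_amount_by_operator(inspections: pd.DataFrame) -> pd.Series:
    """For each inspection, total penalties imposed on the same operator at earlier inspections."""
    df = inspections.sort_values(["operator", "inspection_date"])
    running_total = df.groupby("operator")["penalty_amount"].cumsum()
    prior_total = running_total - df["penalty_amount"]  # exclude the focal inspection
    return prior_total.reindex(inspections.index)
```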


Table A4: Impact of Inspection Experience on Violations – Excluding Data Without Inspections and Production

Outcome Equation (Dependent Variable (D.V.): Violations = 1 if any violation was detected, 0 otherwise)

| | (O1) | (O2) | (O3) |
|---|---|---|---|
| Constant | -2.051*** (0.6165) | -2.215*** (0.6204) | -2.143*** (0.6206) |
| H1a, H1b - (Inspection Experience at Other Units) | | | |
| Cumulative Inspection-Operator (θ1) | -0.00003 (0.00003) | | |
| Cumulative Inspection-County (θ2) | -0.00013*** (0.00002) | | |
| H2a, H2b - (Learning from within the Organization) | | | |
| Cumulative Success-Operator (β1) | | 0.00002 (0.00003) | 0.00007 (0.00004) |
| Cumulative Failure-Operator (β2) | | -0.0021** (0.0007) | |
| Cumulative Penalty-Operator (β7) | | | -0.0131*** (0.0039) |
| Cumulative Penalty No-Operator (β8) | | | -0.0022** (0.0007) |
| H3a, H3b - (Learning from outside the Organization) | | | |
| Cumulative Success-County (β3) | | -0.00006 (0.00003) | -0.00006 (0.00004) |
| Cumulative Failure-County (β4) | | -0.0016** (0.0006) | |
| Cumulative Penalty-County (β9) | | | -0.00240 (0.0032) |
| Cumulative Penalty No-County (β10) | | | -0.0017** (0.0006) |
| Controls: Focal Unit - Inspection and Production Experience | | | |
| Cumulative Inspection-Well | 0.0055 (0.0029) | | |
| Cumulative Success-Well | | 0.00085 (0.0036) | 0.00143 (0.0035) |
| Cumulative Failure-Well | | 0.0477** (0.0171) | |
| Cumulative Penalty-Well | | | -0.03225 (0.0382) |
| Cumulative Penalty No-Well | | | 0.0618** (0.0192) |
| Cumulative Gas (million cubic feet) | 0.00001 (0.0000) | 0.00001 (0.0000) | 0.00001 (0.0000) |
| Cumulative Oil (thousand barrels) | -0.000002 (0.00001) | -0.000003 (0.00001) | -0.000006 (0.00001) |
| Controls: Focal Unit - Prior Inspection Record | | | |
| Good Reputation | -0.01430 (0.0535) | 0.05437 (0.0567) | 0.05405 (0.0567) |
| Bad Reputation | 0.3111** (0.0974) | 0.241* (0.1041) | 0.24255* (0.1040) |
| Deteriorating Reputation | 0.07911 (0.0631) | 0.08375 (0.0640) | 0.08559 (0.0640) |
| Last Inspection-Penalty | 0.2738*** (0.0572) | 0.283*** (0.0574) | 0.28426*** (0.0573) |
| Days Since Last Inspection | 0.00006 (0.0001) | 0.00003 (0.0001) | 0.00004 (0.0001) |
| Prior Rate of Violations at Operator | 0.7079*** (0.1186) | 0.6577*** (0.1198) | 0.65476*** (0.1193) |
| Prior Rate of Violations at County | -0.27190 (0.1571) | -0.316 (0.1616) | -0.30534 (0.1603) |
| Inspector, Operator, County, Year Fixed Effects | Yes | Yes | Yes |
| Log-Likelihood | -277,848 | -277,795 | -277,788 |
| Wells | 9,800 | 9,800 | 9,800 |
| Number of Inspections | 52,531 | 52,531 | 52,531 |

Selection Equation (D.V.: Well Inspected = 1 if well was inspected, 0 otherwise)

| | (S1) | (S2) | (S3) |
|---|---|---|---|
| Constant | -2.098*** (0.0359) | -2.098*** (0.0359) | -2.098*** (0.0359) |
| Good Reputation | -0.03541** (0.0134) | -0.03541** (0.0134) | -0.03541** (0.0134) |
| Bad Reputation | 0.2274*** (0.0393) | 0.2274*** (0.0393) | 0.2274*** (0.0393) |
| Deteriorating Reputation | 0.148*** (0.0185) | 0.148*** (0.0185) | 0.148*** (0.0185) |
| Last Inspection-Penalty | 0.08771** (0.0326) | 0.08773** (0.0326) | 0.08761** (0.0326) |
| Last Inspection-Complaint | 0.08092*** (0.0146) | 0.08092*** (0.0146) | 0.08092*** (0.0146) |
| Prior Rate of Violations at Operator | 0.01044 (0.0249) | 0.01045 (0.0249) | 0.01046 (0.0249) |
| Prior Rate of Violations at County | -0.3375*** (0.0392) | -0.3375*** (0.0392) | -0.3375*** (0.0392) |
| Number | 10,174,522 | 10,174,522 | 10,174,522 |
| Wells | 9,800 | 9,800 | 9,800 |
| Wald Test for Independence of Equations: Rho | 0.033 | 0.034 | 0.033 |
| Selection-Chi-Square | 9.11*** | 9.35*** | 9.05*** |

Notes: * p<0.05, ** p<0.01, *** p<0.001. We report coefficient estimates with cluster-robust standard errors in parentheses. The model is significant at p<0.001.