Bureau of Economic and Business Research   
 
_______________________________________________________________________

 

1995 Abstracts for Working Papers

 

95-0144
This paper is designed to review the experience in the western portion of Nevada with pre-railroad transport, as a background to developing the interdependence of railroads and mining. Most of the early railroads were built to serve mining areas, and they declined with the fall of mining, failing to develop adequate nonmining traffic. Much of the capital investment in the railways would have yielded far more gain--to the investors and to society--if it had been made elsewhere, but mining dominated the decision making about railways and in many instances provided the capital to build the lines. Of the nine railroads covered in the study, two were clearly economically warranted and possibly two were so justified if considered in conjunction with their mining sector owners. But a substantial portion of the total mileage proved to be unwarranted. The railroads demonstrated an amazing facility to reduce costs as traffic fell, and to continue to operate for very long periods on a volume of traffic typically regarded as incapable of supporting a rail line, but in the end they were doomed.


95-0143
This paper introduces a contracting model which extends over multiple time periods, with a penniless entrepreneur, an investor, differential information, and choice-theoretic costly enforcement. Time-consistent, enforceable contracts are analyzed and shown to be less state contingent and to entail less randomization than contracts which are not time-consistent. Specifically, we show that simple debt (with deterministic enforcement) is optimal even in the presence of stochastic enforcement. Contract renegotiation, an alternative to enforcement, is then analyzed. We show that if renegotiation occurs, it must be offered stochastically for incentive reasons. Thus agreements with deterministic features and stochastic features co-exist. These results appear to accord well with the robust empirical observation of both debt and contract renegotiation.


95-0142
We examine a pay-your-bid auction in which purchasers may win zero, one, or two units. We derive the first-order conditions for symmetric equilibria in such an auction, and show that they imply that both separating and pooling of bids must occur with positive probability in the auction.
The behavior in this auction thus contrasts dramatically with multi-unit uniform-price auctions; it also contrasts with the behavior in asymmetric first-price auctions for a single good, whose first-order conditions bear a superficial resemblance to those presented here.
We also provide numerical results for the case of a uniform distribution. Our results demonstrate that behavior of multi-unit auctions in which purchasers can receive a variable number of units should not be predicted based on behavior of auctions in which bidders can only receive one unit, even if multiple units are sold.


95-0141
We present partial results showing that risk-sensitive oligopolists would spend less on advertising than would their risk-neutral counterparts. The model is an infinite-horizon stochastic game in which, at the beginning of each time period, the level of each firm's "goodwill" is a random function of its own current and past advertising expenditures, as well as the current and past advertising expenditures of each of its competitors. Therefore, the advertising done by each firm in the current period influences the goodwill measures of all firms in subsequent periods. The profit generated by each firm depends on its own level of goodwill and current advertising and on the goodwill levels and current advertising of each of its competitors. We assume that single-period firm profits have a market share attraction form. The objective of each firm is to maximize the expected utility of the sum of discounted profits generated over an infinite horizon. The utility function we employ allows us to model explicitly the risk sensitivity of the streams of random rewards accruing to each firm. We analyze the impact that risk sensitivity and other parameters have on equilibrium advertising strategies by exploiting the special structure of the stochastic game model.


95-0140
The infinitely repeated prisoner's dilemma has a multiplicity of Pareto unranked equilibria. This leads to a battle of the sexes problem of coordinating on a single efficient outcome. One natural method of achieving coordination is for the players to bargain over the set of possible equilibrium allocations. Unfortunately, there are many different cooperative bargaining solutions from which to choose, and players may not agree on which is most preferred. In the event of disagreement over bargaining solutions, it is reasonable to expect agents to randomize over their favorite choices. This paper asks the following question: do the players risk choosing an inefficient outcome by resorting to such randomizations? In general, randomizations over points in a convex set yield interior points. We show, however, that if the candidate solutions are the two most frequently used - the Nash and Kalai-Smorodinsky solutions - then for any prisoner's dilemma, this procedure guarantees coordination on an efficient outcome.


95-0139
Historically, additions to public infrastructure necessitated by urban growth have been financed under a cost-sharing approach. In the last several decades, however, financing of growth has increasingly relied on land-use exactions, where new residents pay for the cost of incremental infrastructure. Despite the emergence of a formal growth-control literature, there has been virtually no formal analysis of the connection between infrastructure financing and urban development. To provide such an analysis, the paper investigates three different schemes for financing incremental infrastructure within an urban growth model. The analysis compares an impact-fee scheme to two types of cost-sharing schemes, deriving the effects on urban growth and land values of switching to the impact-fee scheme. The efficient financing scheme is also identified.


95-0138
The nursing home industry has been described as exhibiting excess demand. In states where excess demand is present, nursing homes may have an incentive to reduce quality, because the additional revenue a home receives from Medicaid patients raises the marginal cost of increasing quality to compete for more private-pay patients. If excess demand leads to reduced quality as Medicaid rates are increased, a corollary problem that should be of interest to policy makers is the impact on quality of loosening CON regulations to the extent that excess supply could result. The goals of this paper are to determine the impact of raising the Medicaid reimbursement rate on the quality that homes provide, and the impact of a home's occupancy rate and payer mix (high private-pay or high Medicaid) on the level of quality as the Medicaid reimbursement rate changes. We believe our estimates of this model are superior to earlier estimates because of our continuous measure of quality. The results, contrary to earlier studies, provide evidence that raising Medicaid rates and/or eliminating CONs has positive effects on the quality of nursing home care for the entire sample of homes, not just those homes thought to be in an excess-supply market. The resulting elasticity was 1.31. Furthermore, homes with higher occupancy rates increase quality more than those with lower occupancy rates as the Medicaid reimbursement rate increases. Additionally, the breakdown of payer types in the home has a significant impact on the change in quality provided when the Medicaid rate is changed. When excess supply exists, homes with more private-pay patients will increase quality more than homes with more Medicaid patients as the Medicaid reimbursement rate is increased.


95-0137
Product competition in a growing number of markets is undergoing a profound transformation. New kinds of "flexible" products and organizations are enabling some firms to pursue innovative product strategies that offer markets unprecedented levels of product variety and change. This article explores changes in strategies for product design and for organizing and coordinating development processes that are driving this transformation of product competition.
Strategic management concepts and product strategies associated with stable, evolving, and dynamic product markets are compared. Concepts of modularity in products and organizations are identified as the core concepts behind new kinds of product strategies now emerging in dynamic product markets. Differences between modular product design and conventional approaches to designing products are explained, and the competitive advantages of modular product design strategies are elaborated.
Modularity in product designs decouples processes for developing components, allowing those processes to become concurrent, autonomous, and distributed, and making possible the adoption of modular organization designs for product development. "Quick-connect" electronic interfaces that allow firms to create an electronically mediated product development network add to the flexibility of the modular product creation processes.
Modularity in products and organizations also requires new concepts for strategically managing knowledge. Creating modular product architectures requires an intentional and disciplined decoupling of technology development and product development. As a consequence, however, modular product design leads to better understanding of the state of a firm's knowledge and makes possible more effective strategic management of technology development. A hierarchy of knowledge that distinguishes know-how, know-why, and know-what forms of knowledge is presented as a basis for developing new strategies for leveraging knowledge in product creation networks.
This article concludes by arguing that success--and perhaps even survival--in product competition will increasingly depend on more effective strategic management of product and organization architectures.


95-0136
Public pension funds are a significant, and rapidly growing, financial force. However, the lack of a consensus on the appropriate funding level is apparent in the wide diversity of funding levels currently maintained. This research proposes a financial standard for public pension plan funding that depends on the current pension obligation and the growth rates of pension expenses and tax base, and then compares the optimal funding levels, over time, with actual funding levels by state. Based on this approach, funding levels should vary by state, but many states are funding public pensions at levels well below the optimal values, which is likely to lead to serious long-term problems.


95-0135
A mechanism coalitionally implements a social choice set if any outcome of the social choice set can be achieved as a coalitional Bayesian Nash equilibrium of a mechanism and vice versa. We say that a social choice set is coalitionally implementable if there is a mechanism which coalitionally implements it. Our main theorem proves that a social choice set is coalitionally implementable if and only if it is individually rational, Pareto optimal, coalitional Bayesian incentive compatible, and satisfies a coalitional Bayesian monotonicity condition as well as a closure condition. As an application of our main result, we show that the private core and private Shapley value of an economy with differential information are coalitionally implementable.


95-0134
This paper suggests that an essential task in building a competence-based theory of strategy is to integrate previously unconnected theories singularly focused on the economic content or the cognitive processes of strategy making. We discuss the integration of such dissociative theories at three levels: (1) the strategy making and testing processes of managers competing in specific contexts; (2) the theory building and testing processes of researchers looking for insights that are generalizable across competitive contexts; and (3) the interactions between managers and researchers in building a general theory of competence that also works in specific contexts.
To accomplish these ends, we suggest that strategy researchers and managers should be engaged in an interactive, reciprocating process in building competence theory. We propose that researchers and managers embark on a new theory building process in which the generalized theories of researchers and the contextual theories of managers may evolve in a dynamic of double-loop learning.


95-0133
In this paper, we provide axiomatic foundations for social choice rules on a domain of convex and comprehensive social choice problems when agents have cardinal utility functions. We translate the axioms of three well known approaches in bargaining theory (Nash [1950], Kalai and Smorodinsky [1975], and Kalai [1977]) to the domain of social choice problems and provide an impossibility result for each. We then introduce the concept of a reference function which, for each social choice set, selects a point from which relative gains are measured. By restricting the invariance and comparison axioms so that they only apply to sets with the same reference point, we obtain characterizations of social choice rules that are natural analogues of the bargaining theory solutions.


95-0132
This paper reports a model for the evaluation of services that incorporates the real-time updating of expectations, along with tests of the model. This model extends the work of Boulding, Kalra, Staelin, and Zeithaml (1993). Principally, the proposed model suggests that customers' evaluations of a service encounter are partly based on expectations generated during the encounter itself. Results show that 87% of the paths in the proposed model are significant. The results also show the possible existence of a differential-filtering effect of information generated during the encounter. The real-time updating model is perhaps also useful in modeling other interactive encounters, such as negotiations, personal selling, and interactive shopping.


95-0131
This paper proposes a framework for analyzing a firm's knowledge assets and for devising effective processes for leveraging and controlling different kinds of knowledge in competence-based competition. "Tacit knowledge" is critiqued for its limited ability to be leveraged and controlled. A framework for identifying categories of articulated knowledge is developed by examining differences in the contexts and the contents of articulated knowledge and the process by which articulated knowledge is transferred or diffused between contexts. In managing processes of knowledge articulation, codification, and apprehension strategically, it is useful to recognize three distinct kinds of knowledge--know-how, know-why, and know-what--that will be of different relative strategic importance in different competitive contexts. This framework is applied to three different competitive contexts to illustrate strategies for managing knowledge that appear to be effective in leveraging less critical kinds of articulated knowledge while controlling more critical kinds of knowledge. Concluding comments suggest issues for further research of interest to both researchers and managers.


95-0130
We study a model of a local public goods economy in which a formal distinction is made between the crowding effect and tastes of agents. It has been shown that decentralization of the core is possible in the crowding types model with admission prices that take into account only publicly observable information. Decentralization of the core with anonymous prices is also possible in the nondifferentiated crowding model using a Lindahl price system when technology is linear. Lindahl price systems are superior to admission price systems in that they can be specified with a finite set of numbers. In this paper we explore the possibility of Lindahl decentralization of the core in a crowding types model. We show that the core is equivalent to the set of nonanonymous Lindahl equilibria. In contrast to the nondifferentiated crowding case, however, the core is generally larger than the set of anonymous Lindahl equilibria regardless of technology.


95-0129
The paper develops a framework for comparing formal and informal sector wages that accounts for the specific characteristics of each sector. It then examines wage differentials of individual workers moving among four sectors, the formal salaried and three informal sectors, controlling for unobserved worker characteristics and for sector characteristics. Little evidence is found for segmentation between the formal and informal sectors. Informal self-employment and contract work are found to pay as well as or better than formal salaried work. However, informal salaried workers are found to earn uniformly less than all other classes of work.


95-0128
The paper argues that the informal sector provides desirable employment options in low productivity economies such as Mexico's, and does not primarily comprise workers rationed out of the formal sector. Using transition matrices, multinomial logit techniques, and survey data, it studies the mobility patterns of Mexican workers between the formal sector and three informal sectors. Overall mobility in the labor market is found to be high. Informal salaried work appears to serve as a port of entry for workers before they enter either longer-term formal or informal sector jobs, and informal sector contract work is not found to be obviously undesirable. There is evidence for a life cycle view where workers enter formal employment to gain financial and human capital, and then voluntarily enter informal self-employment.


95-0127
When a nuisance parameter is not identified under the null hypothesis, the information matrix is singular. Therefore, the conventional tests cannot be implemented for testing the null hypothesis. Testing for the regression coefficient's stability suffers from this problem. We examine Davies' (1977, 1987) tests, and propose a joint LM test for autocorrelation and heteroskedasticity as an alternative. The computational cost of the joint LM test is trivial compared to other tests. Our Monte Carlo study demonstrates that the joint LM test has good finite sample power properties. An empirical application is also provided to illustrate our test procedure.


95-0126
We present a model of multi-unit auctions in which some of the units may be sold before the auction (i.e., "non-competitively") at a price to be established by the auction. All potential buyers have similar information and costs but, at equilibrium, some buy non-competitively while the remainder bid in the auction, and the seller benefits from non-competitive sales.


95-0125
Auctions in which individuals can purchase more than one unit of the good being sold differ in striking ways from auctions in which a single good is sold. The uniform-price auction in particular yields many Nash equilibria in which bidders divide the good while paying very low prices for it. This paper characterizes equilibria for the auction.
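The demand-reduction logic behind such low-price equilibria can be sketched in a toy example (an illustrative construction, not the paper's model): two bidders each value two identical units at 1, the two highest of the four submitted bids win, and the uniform price is the highest rejected bid. If each bidder bids full value for a first unit and zero for a second, the market clears at a price of zero, and no unilateral deviation on a discrete bid grid improves either bidder's payoff.

```python
from itertools import product

def uniform_price_outcome(bids_a, bids_b):
    """Two bidders, two identical units. The two highest bids win;
    the uniform price is the highest rejected bid. Ties are broken
    by the (arbitrary) stable sort order."""
    tagged = [(b, 'a') for b in bids_a] + [(b, 'b') for b in bids_b]
    tagged.sort(key=lambda t: t[0], reverse=True)
    winners, losers = tagged[:2], tagged[2:]
    price = losers[0][0]
    units = {'a': 0, 'b': 0}
    for _, who in winners:
        units[who] += 1
    return units, price

def payoff(units_won, price, value=1.0):
    # Flat demand: each unit is worth `value` to the bidder.
    return units_won * (value - price)

# Candidate demand-reduction equilibrium: each bidder bids (1, 0).
eq = (1.0, 0.0)
units, price = uniform_price_outcome(eq, eq)
base = payoff(units['a'], price)   # each wins 1 unit at price 0
assert price == 0.0 and base == 1.0

# Check every deviation for bidder a on a discrete grid of bids:
# none earns more than the equilibrium payoff.
grid = [0.0, 0.25, 0.5, 0.75, 1.0]
best_dev = max(payoff(uniform_price_outcome(dev, eq)[0]['a'],
                      uniform_price_outcome(dev, eq)[1])
               for dev in product(grid, repeat=2))
assert best_dev <= base
print(price, base, best_dev)
```

Bidding for the second unit only raises the price paid on the first, so withholding demand is self-enforcing, which is why the seller's revenue can collapse even with competitive bidders.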


95-0124
The spatial mismatch hypothesis, first stated by Kain (1968), argues that job decentralization in US cities has contributed to low income and high unemployment rates for black Americans. Decentralization relocates job sites to white suburban communities far from the CBD, and housing segregation prevents blacks from relocating their residences near the new workplaces. The purpose of the paper is to analyze an urban equilibrium with spatial mismatch. Despite the existence of a suburban employment center, blacks in the model are forced to live in the central zone they occupied in the original monocentric city, commuting across the white residential area to access suburban jobs. This "mismatch" equilibrium is contrasted with an unrestricted equilibrium where blacks are free to locate wherever they choose.


95-0123
In predicting whether a negotiator's concession behavior will be reciprocated (Osgood 1962) or will be exploited (Siegel & Fouraker 1960), empirical findings provide mixed support for either prediction (Rubin & Brown 1975; Druckman 1977; Pruitt 1981). This study examined three factors that may foster exploitation or reciprocation: (1) the stage of the negotiation process, (2) a negotiator's level of masculinity, and (3) a negotiator's level of femininity. Data were collected from 135 undergraduate business students, who participated in a computer-simulated negotiation game. The results indicated that, in general, negotiators reciprocated the opponent's concessions at the initial stage of the negotiation process. Moreover, the results indicate that, at the initial stage of the negotiation process, only low-femininity negotiators reciprocated the opponent's concessions (i.e., conceded more when the opponent conceded more), while high-femininity negotiators responded to the demand of the opponent (i.e., conceded more when the opponent conceded less). At the final stage of the negotiation process, counter-intuitively, low-masculinity negotiators exploited the opponent's concessions (i.e., conceded less when the opponent conceded more), but high-masculinity negotiators' concessions were not influenced by their opponent's concession amount.


95-0122
We relate the predictability of future returns from past returns to the market's underreaction to information, focusing on past earnings news. Past returns and past earnings surprises each predict large drifts in future returns after controlling for the other. There is little evidence of subsequent reversals in the returns of stocks with high price and earnings momentum. Market risk, size, and book-to-market effects do not explain the drifts. Security analysts' earnings forecasts also respond sluggishly to past news, especially in the case of the stocks with the worst past performance. The results suggest a market that responds only gradually to new information.


95-0121
This paper investigates interrelationships of product design, organization design, processes for creating and leveraging knowledge, and competitive strategy. This paper uses the principle of nearly decomposable systems to investigate the ability of standardized interfaces between components in a product design and between development organizations using a shared CADD/CIM system to imbed coordination of development and production processes. Imbedded coordination creates "hierarchical coordination" without the need to continually exercise authority--enabling effective coordination of processes without the tight coupling of organizational structures. The current paper develops concepts of modularity in product and organization designs based on standardized component and organization interfaces. Modularity in product and organization design creates information structures that can reduce the cost and difficulty of adaptive coordination, thereby increasing the strategic flexibility of firms to respond to environmental change. Modularity in product and organization designs also requires a new approach to the management of knowledge based on intentional, carefully managed loose coupling of a firm's knowledge-leveraging processes and its knowledge-creation processes.


95-0120
We consider the problem of a principal who wishes to induce two agents playing a one-shot prisoners' dilemma to behave cooperatively. We assume the principal cannot observe the actions of the agents, and is not able to change the strategy sets or payoff functions in the underlying game. The only power the principal has is to randomly delay the arrival of payoffs. Specifically, agents choose their one-shot strategies and the principal randomly determines whether these are "cheap talk," or whether payoffs should be distributed. If the round is cheap talk, then each agent observes the strategy choice of the other and moves to a new round. This continues until payoffs are distributed. We establish conditions under which the probability of cheap talk can be chosen at the beginning of the induced game in such a way that full cooperation is the only equilibrium outcome. The sufficiency condition is met by a wide class of economic interpretations of the prisoners' dilemma, including those involving strategic complementarities among players.


95-0119
We propose a new model of a local public goods economy with differentiated crowding. The new feature is that taste and crowding characteristics of agents are distinguished from one another. We prove that if the economy satisfies strict small group effectiveness, then the core is equivalent to the set of Tiebout equilibrium outcomes. The equilibrium prices are defined to depend only on crowding characteristics. This implies that only publicly observable information, and not private information such as preferences, is needed to induce agents to sort themselves into efficient jurisdictions. Thus, our model allows us to satisfy Bewley's (1981) anonymity requirement on taxes in his well-known criticism of Tiebout's hypothesis.


95-0118
We consider a new model of a local public goods economy with differentiated crowding in which we make a distinction between the taste and crowding characteristics of agents. It is possible in this model to have a taste-homogeneous jurisdiction that takes advantage of the full array of positive crowding effects (labor complementarities, for example). We nevertheless find that tastes are homogeneous in jurisdictions with the same crowding profile. We also provide an example which illustrates the difficulties in extending the intuitive results from the hedonic pricing literature to a Tiebout economy with differentiated crowding.


95-0117
This paper presents an integrated system for establishing bond ratings that is user friendly, intuitive, and credible. The proposed system is jointly based on the foundation of valuation theory--cash flow information--and an analytical system which measures the amount of uncertainty contained in the cash flow information--tree-based inductive learning. We show that the financial health of a firm is closely related to the performance of its cash flow components. The induced tree selects the cash flow components that are most important in determining the bond ratings. A major concern for bond analysts when using either inductive learning systems or statistical analysis is the instability of the classification results. To reduce the large variability in the structure of the induced tree, a Global Tree Interpretation Process (Gtip), based on a jackknife procedure, is developed. An experiment using the proposed credit rating system was conducted on a set of commercial loans from a large regional bank in the United States. Using Gtip, the proposed credit rating system had a predictive accuracy of 88.4 percent. Positive feedback from the bankers provided substantial credibility on the use of the proposed credit rating system. The total results are encouraging and suggest the need for further research.


95-0116
One of the most important ideas in public economics is Tiebout's hypothesis that if public goods are "local," then markets should be able to overcome the free-rider problem. In this paper we discuss the different approaches to formalizing the Tiebout hypothesis as a decentralization theorem. Special attention is devoted to the structure of the price systems required for decentralization. We argue that unless prices are anonymous in the sense that they do not discriminate between agents on the basis of unobservable characteristics (tastes, for example), they do not decentralize in the same sense as Walrasian prices do in private goods economies. We consider the theorems available for three basic local public goods models: anonymous crowding, differentiated crowding, and a new model called crowding types. We also discuss anonymous pricing in the context of market games, and some results which show more generally that in large economies, prices can be anonymous.


95-0115
Starrett (1972) argues that the presence of externalities implies fundamental nonconvexities which cause Arrow markets to fail. While this is true, we argue this failure is due to the structure of the Arrovian markets that Starrett uses, and not to the presence of externalities as such. We provide an extension of a general equilibrium public goods model in which property rights are explicitly treated. Nonconvexities are not fundamental in this framework. We define a notion of Coasian equilibrium for this economy, and show first and second welfare theorems. In this context, the first welfare theorem is a type of Coase theorem.


95-0114
A two-sector economy model is set up in an uncertain lifetimes framework. One of the sectors is monopolistically competitive. It is shown that a balanced budget fiscal expansion increases the steady-state welfare of the representative individual, as well as welfare along the transition path.


95-0113
We compare execution costs (market impact plus commission) on the NYSE and on NASDAQ for institutional investors. The differences in cost generally conform to each market's area of specialization. Controlling for firm size, trade size, and the money management firm's identity, costs are lower on NASDAQ for trades in comparatively smaller firms (with 1991 market capitalization below $1.2 billion). For the smallest firms, the cost advantage under a pre-execution benchmark is 0.68 percent. However, trading costs for the larger stocks are lower on the NYSE. For the largest stocks (above $4.5 billion in capitalization), costs are lower by 0.48 percent on the NYSE.


95-0112
Jørgen Pedersen (1890-1973) introduced Keynesian ideas in Denmark but was more than just another Keynesian. Pedersen (1937) articulated fiscal activism as Hansen (1941) was to do four years later. In their basic models, neither Keynes himself nor his American followers found room for labor union wage or price policy. Pedersen (1944, 1948, 1964) and (1951) found such room and analyzed the macroeconomic consequences of a collision between wage policy and monetary policy. His analysis was intuitive, but the present article offers a rigorous restatement of it.


95-0111


95-0110
In recent years, ARCH models have emerged as an indispensable tool for modeling the conditional second moment of economic variables, and therefore, proper formulation of the conditional variance function is of the utmost importance. In order to provide a unified approach to the problem of finding stationarity conditions and test statistics for various specifications of conditional heteroskedasticity, we propose a general random coefficient disturbance process which encompasses AR, ARCH, and GARCH processes. Through the vector representation of the model, we use a new procedure to derive stationarity conditions for our general disturbance process and discuss the interaction between autocorrelation and conditional heteroskedasticity. We also show that the stationarity conditions for AR, ARCH, and GARCH models can be obtained as a special case of our result. Test statistics for conditional heteroskedasticity and autocorrelation are proposed. Through an illustrative example of estimating the variability of inflation, we show how misspecifying conditional heteroskedasticity or neglecting autocorrelation can affect inference about the conditional second moment of a random variable.
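For the standard GARCH(1,1) special case covered by such results, the familiar stationarity condition can be illustrated with a short simulation (a sketch under assumed parameter values, not the paper's general random-coefficient process): the disturbance is covariance stationary when alpha + beta < 1, with unconditional variance omega / (1 - alpha - beta).

```python
import random
import statistics

def simulate_garch(n, omega, alpha, beta, seed=0):
    """Simulate a GARCH(1,1) disturbance: e_t = z_t * sqrt(sigma2_t),
    where sigma2_t = omega + alpha * e_{t-1}^2 + beta * sigma2_{t-1}
    and z_t is standard normal."""
    rng = random.Random(seed)
    sigma2 = omega / (1.0 - alpha - beta)  # start at unconditional variance
    e_prev = 0.0
    out = []
    for _ in range(n):
        sigma2 = omega + alpha * e_prev ** 2 + beta * sigma2
        e_prev = rng.gauss(0.0, 1.0) * sigma2 ** 0.5
        out.append(e_prev)
    return out

# Illustrative parameters: alpha + beta = 0.9 < 1, so the process is
# covariance stationary with unconditional variance 0.2 / (1 - 0.9) = 2.
es = simulate_garch(20000, omega=0.2, alpha=0.1, beta=0.8)
print(statistics.pvariance(es))  # sample variance, close to 2.0
```

As alpha + beta approaches 1 the unconditional variance diverges, which is the boundary case the general stationarity conditions rule out.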


95-0109


95-0108
This paper presents an integrated system for establishing credit ratings that is user friendly, intuitive, and credible. The proposed system is jointly based on the foundation of valuation theory--cash flow information--and an analytical system that measures the amount of uncertainty contained in the cash flow information--tree-based inductive learning. We show that the financial health of a firm is closely related to the performance of its cash flow components. The induced tree selects the cash flow components that are most important in determining the credit ratings. A major concern for credit analysts when using either inductive learning systems or statistical analysis is the instability of the classification results. To reduce the large variability in the structure of the induced tree, a Global Tree Interpretation Process (Gtip), which is based on a jackknife procedure, is developed. An experiment using the proposed credit rating system was conducted on a set of commercial loans from a large regional bank in the United States. Using Gtip, the proposed credit rating system had a predictive accuracy of 88.4 percent. Positive feedback from the bankers provided substantial credibility on the use of the proposed credit rating system. The total results are encouraging and suggest the need for further research.


95-0107
The primary objective of this paper is to present a sound approach to assessing a company's strategic performance. The proposed approach is based on the longitudinal patterns of five key cash flow components. The relationships among these five cash flow components provide the core for interpreting a firm's strategic performance. An example using information from selected food companies is presented to highlight the contributions that key cash flow components make to the interpretation of a company's strategic performance.


95-0106
Jensen and Murphy have advocated that the level of a CEO's compensation is not as important as the "how" of the incentive package. They identified two sets of twenty-five CEOs whose compensation packages are "best" and "worst" aligned with the interests of the shareholders. An analysis of these "best" and "worst" aligned firms' stock returns indicates that the market did not place a consistent return premium on pay-for-performance CEO firms versus misaligned CEO incentive firms during 1989 through 1990. These findings persist when one controls for CEO change as well as regulatory-interest rate influences. Although the empirical data appear to contradict the Jensen and Murphy position, capital market efficiency arguments suggest that one might expect these results to occur.


95-0105
In this paper we propose simple diagnostic tests, based on OLS residuals, for spatial error autocorrelation (or spatially lagged dependent variable) in the presence of a spatially lagged dependent variable (or spatial error autocorrelation), applying the modified LM test developed by Bera and Yoon (1993). Our new tests may be viewed as computationally simple and robust alternatives to some existing procedures in spatial econometrics. We provide empirical illustrations to demonstrate the usefulness of the proposed tests. The finite sample size and power performance of the tests are also investigated through a Monte Carlo study. The results indicate that the adjusted LM tests have good finite sample properties. In addition, they prove to be more suitable for the identification of the source of dependence (lag or error) than their unadjusted counterparts.
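The Bera and Yoon adjustment is not reproduced in the abstract; for context, here is a minimal sketch of the classic (unadjusted) LM test for spatial error autocorrelation computed from OLS residuals — the lattice weight matrix, parameter values, and simulated data are all illustrative assumptions, and the adjusted tests add a correction term for the local presence of a spatial lag that is not shown here:

```python
import numpy as np

def lattice_w(k):
    """Rook-contiguity weight matrix for a k x k lattice, row-standardized."""
    n = k * k
    W = np.zeros((n, n))
    for i in range(k):
        for j in range(k):
            s = i * k + j
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                a, b = i + di, j + dj
                if 0 <= a < k and 0 <= b < k:
                    W[s, a * k + b] = 1.0
    return W / W.sum(axis=1, keepdims=True)

rng = np.random.default_rng(42)
k = 20
n = k * k
W = lattice_w(k)
lam = 0.8                                     # strong spatial error autocorrelation
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
u = np.linalg.solve(np.eye(n) - lam * W, rng.standard_normal(n))
y = X @ np.array([1.0, 2.0]) + u

# OLS residuals and the classic LM statistic (chi-squared with 1 df):
#   LM_err = (e'We / sigma2)^2 / tr(W'W + WW)
e = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
sigma2 = e @ e / n
T = np.trace(W.T @ W + W @ W)
lm_err = (e @ W @ e / sigma2) ** 2 / T
print(lm_err)   # compare with the 5% chi2(1) critical value, 3.84
```

With spatially correlated errors built into the simulation, the statistic far exceeds the critical value; the abstract's point is that this unadjusted statistic also responds to a spatial lag, which is what the adjusted variants correct for.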


95-0104
Recent shareholder activism suggests that the shareholder-manager conflict of interests, and the effectiveness of mechanisms to narrow this divergence, is an issue of continuing importance. This study examines the linkages between institutional, market and legal mechanisms to control managerial discretion in management buyouts, and the circumstances under which each type of mechanism is effective. We find that these mechanisms do act to control managerial discretion in management buyouts to some degree. At the same time, there appear to be significant frictions which act to partially insulate managers from these types of governance, limiting their effectiveness.


95-0103
Global competition, rapid changes in technology, and market fragmentation have resulted in shorter product life cycles. In order to remain viable, it is increasingly important for firms to introduce new products frequently. Product design is a complex process that involves coordination of activities among several functional disciplines within the company as well as the customers and the suppliers. Traditionally, the information flow among the various product development stages has been sequential. However, there is increasing evidence that an integrated approach which considers several stages simultaneously may be superior.
This paper provides a decision support tool for implementing such an integrated approach. On the basis of given customer preferences, the paper presents a model for determining the number of new products to be introduced, the exact specifications of these products, and the production processes for efficiently delivering these specifications. These decisions are made in an integrated manner by simultaneously considering the interaction among the various choice variables. A decomposition-based solution procedure is developed that iterates between the product design and process selection decisions while maintaining an effective link between them.
In addition to understanding the economic value of adopting the integrated approach to product design, the paper discusses how the proposed model can be used effectively to perform sensitivity analysis with respect to some of the important decision variables.


95-0102
In this study, we investigate the impact of state-level prohibitions on the founding and mortality rates of breweries in prohibition-free states. Our results suggest that particularistic institutional action such as nonuniform government regulation leads to unanticipated consequences of two kinds. First, it creates resource-related opportunities for organizations that are not directly affected by such action. Second, it imposes indirect coercive pressures by influencing cultural expectations in the environment of organizations that are not directly affected by such action. We also find that the overall direction and strength of these unanticipated effects vary with the centrality of organizations in terms of their location, the time elapsed since an environmental change, and organizational age.


95-0101
This paper develops a growth-control model more realistic than those available in the literature by replacing the usual class of absentee landowners by resident landowners, who live within the city. The analysis shows that resident landowners have a weaker taste for growth controls than their absentee counterparts. The reason is that since the resident landowners pay rent to themselves, a control-induced escalation of rent for the land they occupy confers no benefits. Absentee landowners, by contrast, gain from rent increases throughout the city, and thus favor more stringent controls. The model is generalized in a number of directions.


95-0100
In recent years, ARCH models have emerged as an indispensable tool for modeling many financial and economic time series. In this paper we consider whether the wide acceptance of the ARCH process may be at the expense of other nonlinear processes, such as bilinear models, which can offer alternative modeling approaches and possibly and improvement in the predictive power of econometric models. We first propose a test which should have good pwer against the simultaneous presence of ARCH and bilinearity. A non-nested test is then suggested to determine whether non-linear dependence should be attributed to ARCH or bilinearity. The tests are then applied to three series, namely the rate of return on the S&P 500 stock index, the rate of return on the British Pound and the growth rate of the U.S. index of industrial production. Results indicate that nonlinear dependence may not be fully attributable to ARCH, but ARCH still appears to be a better model in terms of statistical properties and out of sample forecasts for the data series considered here.