Bureau of Economic and Business Research   

1997 Abstracts for Working Papers


Finance professors rarely study how students learn finance, leaving the profession with relatively little awareness of how its students learn; that gap motivates the study of student learning styles. The paper presents an overview of the learning-style literature with a focus on Gregorc's learning style theory. This empirical study collects the learning styles of 483 undergraduate students from three universities. A hierarchical loglinear model is used to test various hypotheses on the relationships among students' sex, race, major, and learning style. The analysis shows no difference between the learning styles of African Americans and Caucasians. It also indicates that the learning styles of female students differ significantly from those of male students. Likewise, the study shows a significant association between student race and the selection of an academic major. The results suggest numerous challenges for our profession concerning the need to provide an equitable learning experience for all students enrolled in finance courses. Finally, the study has profound implications for faculty development, as well as for instructional and curriculum development.

Latin American governments increasingly want to attract external capital to increase economic growth and adjust to external shocks. However, their concurrent desire to retain national sovereignty has frequently conflicted with creditors' aim of receiving payment for providing this capital. First, this paper examines how the participation of official and private-sector actors, the ability of these actors to coordinate their actions, the form and maturity structure of external capital, and the prevailing political and international monetary environment have influenced the conditions demanded by creditors and the willingness of Latin American governments to accept constraints on national sovereignty during episodes of capital inflows since independence in the 1820s. Second, we investigate how the changing nature of creditor demands, new forms of capital inflows, implicit government guarantees, and the international monetary environment since 1990 have influenced the constraints placed on policy makers in the aftermath of the Mexican crisis.

We consider a local public goods economy with differential crowding in which agents are distinguished by their tastes and genetic endowments. Agents choose which crowding characteristic they wish to express, and this affects their value to other members of their jurisdiction. An agent's choice is influenced both by his genetic endowment, which affects his cost of acquiring any given crowding characteristic, and by his preference over which crowding characteristic he expresses. For example, an agent may find it very easy to learn accounting but may strongly desire to be an artist. We show that, in general, jurisdictions in the core will not be taste-homogeneous. This contrasts with earlier results for models with endogenous crowding types. We also show that the core is equivalent to the set of anonymous competitive equilibrium outcomes. This implies that the market will not allow agents to be discriminated against on the basis of genetic endowments; the only feature of an agent that is relevant to the market is the crowding characteristic that the agent chooses to express.




This paper adds a land market to a standard Harris-Todaro framework. In the standard model, the equilibrating force that limits rural-urban migration is a decline in the probability of formal employment, which follows from enlargement of the informal sector. The key insight of the present paper, borrowed from Brueckner (1990), is that urban land-rent escalation provides an additional force that limits the extent of migration. The most striking implication of this modified model is that formal-sector growth may not lead to additional migration from rural areas. The reason is that, because of land-rent escalation, such growth may depress a migrant's expected utility despite the improved chance of obtaining a formal job. In the second part of the analysis, the efficiency-wage model is used to make wages and employment in the formal sector endogenous instead of fixed. While many comparative-static effects are ambiguous in this more complex model, the role of the land market is basically unaffected.

The rapid penetration of PCs and easier access to Internet infrastructure are fueling the digitization of products (e.g., newspapers) and services (e.g., voice communications), creating the opportunity to develop, build, and operate a new utility facilitating electronic markets and electronic commerce, referred to here as Digital Interactive Services (DIS). Many companies from the four industries of content, communications, computing, and financial services are trying to seize the DIS opportunity by aligning capabilities and assets through mergers and acquisitions, a process referred to as the convergence of information industries. Many companies in these industries, particularly those with size advantages, prefer to "sit and wait" until more stable structures evolve. But given the increasing-returns economics, positive network externalities, and instant global scale inherent in DIS activities, early hesitation might result in an outright loss of the entire business opportunity. In this paper, we suggest a framework that aims to provide insights into the emerging structures of DIS, which are in turn the foundation for strategy formulation. The framework consists of a generic concept for value creation and delivery in DIS, referred to as the "2-3-6" concept, together with dynamic patterns such as strategic roles defined on top of it; the framework is applied in the context of electronic publishing (EP).

Rao's (1947) seminal paper introduced a fundamental principle of testing based on the score function as an alternative to the likelihood ratio and Wald tests. Neyman's (1959) approach, in view of the presence of nuisance parameters, emphasized the generality and attractive features of score-based tests. Silvey (1959) rediscovered the score test as a Lagrange multiplier test. In later years, Breusch and Pagan's (1980) exposition of the score test in a general framework in the context of econometric modeling resulted in increased activity on specification testing in econometrics. In this paper we trace these historical developments, emphasizing the optimality features of tests based on scores and their usefulness in practical problems in statistics and econometrics. In so doing we give some new results, present easier computation of score-based tests, and provide alternative derivations of some known results. We also discuss a connection between Rao's score test and the seemingly unrelated literature on Fisher's discriminant function, Mahalanobis' D^2, and Hotelling's T^2.
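As a concrete illustration (not from the paper itself), Rao's score statistic requires the model to be fitted only under the null hypothesis. A minimal sketch for the simplest case, testing a hypothesized Poisson mean; the Poisson setting and parameter values are illustrative assumptions:

```python
import numpy as np
from scipy import stats

def poisson_score_test(x, lam0):
    """Rao's score (Lagrange multiplier) test of H0: lambda = lam0
    for an i.i.d. Poisson sample.  Score and information are evaluated
    at the null value, so no alternative-model fit is needed (unlike
    the Wald or likelihood ratio tests)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    # Score of the Poisson log-likelihood at lam0: sum(x_i / lam0 - 1)
    score = np.sum(x / lam0 - 1.0)
    # Fisher information at lam0: n / lam0
    info = n / lam0
    lm_stat = score**2 / info          # equals n * (xbar - lam0)^2 / lam0
    p_value = stats.chi2.sf(lm_stat, df=1)  # asymptotic chi-square(1) null
    return lm_stat, p_value

rng = np.random.default_rng(0)
sample = rng.poisson(lam=2.0, size=500)
stat, p = poisson_score_test(sample, lam0=2.0)
```

When the sample mean equals the hypothesized value exactly, the score is zero and the statistic vanishes, which is the sense in which the test measures the slope of the log-likelihood at the null.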

Are there economic incentives for electronic commerce, or is it just hype? This paper evaluates the cost-based differences between traditional markets (such as retail stores) and electronic markets from both the buyer (demand-side) perspective and the seller (supply-side) perspective. A cost-based model that differentiates between traditional and electronic markets is identified, and an empirical, survey-based study that provides support for the model is discussed. The paper then discusses the implications that a shift toward greater electronic market utilization has for transaction intermediaries, interactive service providers (ISPs), and government. We find that there are significant cost-based differences between traditional and electronic markets for buyers, and that electronic markets affect future sources of organizational revenue.

The greater the benefit from cornering an auction, the more enforcement is needed to deter cornering. We presume that there is just enough enforcement to deter cornering attempts and rank three different auctions by their expected revenue net of enforcement costs: the pay-your-bid or "discriminatory" auction commonly used by the US Treasury, the lowest-winning-bid uniform-price auction used in the current Treasury experiment, and the highest-losing-bid uniform-price auction proposed by Friedman almost four decades ago. We show that the Friedman auction generates the greatest expected revenue net of enforcement costs, the experimental Treasury auction generates less, and the pay-your-bid auction generates the least.

In the newsvendor problem, a decision maker facing random demand for a perishable product decides how much of it to stock for a single selling period. This simple problem with its intuitively appealing solution is a crucial building block of stochastic inventory theory, which comprises a vast literature focusing on operational efficiency. Typically in this literature, market parameters such as demand and selling price are exogenous. However, incorporating these factors into the model can provide an excellent vehicle for examining how operational problems interact with marketing issues to influence decision making at the firm level. In this paper we examine an extension of the newsvendor problem in which stocking quantity and selling price are set simultaneously. Drawing on a fragmented literature in operations management and economics, we provide a comprehensive review that synthesizes existing results for the single period problem and develop additional results to enrich the existing knowledge base. We also include a review of multiple period variants of this model and identify promising areas of research.
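The "intuitively appealing solution" referred to above is the classical critical-fractile rule: stock up to the demand quantile at which the probability of underage equals the ratio of the underage cost to the total of underage and overage costs. A minimal sketch with illustrative prices and an assumed normal demand distribution:

```python
from scipy import stats

def newsvendor_quantity(price, cost, salvage, demand_dist):
    """Optimal single-period stocking quantity: the critical-fractile
    solution q* = F^{-1}(cu / (cu + co)), where cu = price - cost is
    the unit underage (lost-margin) cost and co = cost - salvage is
    the unit overage cost of an unsold item."""
    cu = price - cost          # profit forgone per unit of unmet demand
    co = cost - salvage        # loss per unit left over at period end
    critical_ratio = cu / (cu + co)
    return demand_dist.ppf(critical_ratio)  # inverse demand CDF

# Illustrative numbers: sell at 10, buy at 6, salvage unsold units at 2,
# with normally distributed demand (mean 100, standard deviation 20).
demand = stats.norm(loc=100, scale=20)
q_star = newsvendor_quantity(price=10.0, cost=6.0, salvage=2.0,
                             demand_dist=demand)
```

With these numbers the critical ratio is 0.5, so the optimal stock equals mean demand; raising the margin pushes the ratio, and hence the stocking quantity, upward.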

Optimal operating policies and the corresponding managerial insights are developed for a firm that dynamically establishes a stocking level and a selling price for its product while exploiting information gathered from ongoing operations. Given a management situation in which the demand function depends on selling price and includes an unknown scale parameter, learning occurs as the firm monitors the market's response to its decisions and then updates its characterization of the demand function. Of primary interest is the effect of censored data, since a firm's observations are often restricted to sales. We find that the first-period optimal selling price increases with the length of the problem horizon. However, for a given problem horizon, prices can rise or fall over time, depending on how the scale parameter influences demand. Further results include a characterization of the optimal stocking-quantity decision and a computationally viable algorithm.

Adopting a network perspective, this research proposes that the firm's network of interfirm linkages (e.g., strategic alliances) is a significant influence on its resource and competitive flexibility. Based on evidence from the global steel industry, it is argued that network properties may be managerial "levers" for enhancing strategic flexibility.

This paper analyzes the effects of international airline codesharing on traffic levels and welfare. The benefits of codesharing arise because cooperative pricing of trips by the codesharing partners puts downward pressure on fares in the interline markets. The loss of competition in the interhub market, which connects the hub cities of the partners, generates a countervailing effect, tending to raise the fare in that market. Evaluating the net impact of these countervailing forces is not straightforward because the relevant markets are served via a hub-and-spoke network that carries endogenous traffic flows in many other markets. The analysis shows that the beneficial effect of codesharing outweighs its harmful effect for most parameter values.

Since their introduction to the social sciences in the 1960s, theories of innovation diffusion have sparked considerable research interest and resulted in an extensive literature. Although it is recognized that the diffusion of innovation within a system is fundamentally a process of communication within a social network, diffusion models have not directly incorporated the structural properties of a network. Based on the network and innovation literatures, we develop hypotheses linking structural properties to innovation and imitation potential. The hypothesized relationships form the basis for the analytical model developed in this paper, linking the diffusion parameters (innovation and imitation coefficients) to the structural properties of the network. Research and managerial implications of our model are discussed.
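The "innovation and imitation coefficients" mentioned above are the parameters of the classic Bass diffusion model, in which per-period adoptions are driven by external influence (p) and by word-of-mouth from prior adopters (q). A minimal discrete-time sketch; the parameter values are illustrative, not taken from the paper:

```python
def bass_adoptions(p, q, m, periods):
    """Discrete-time Bass diffusion: in each period, adoptions equal
    (p + q * N/m) * (m - N), where N is cumulative adoption and m is
    market potential.  p is the innovation (external-influence)
    coefficient; q is the imitation (word-of-mouth) coefficient."""
    cumulative = 0.0
    adoptions = []
    for _ in range(periods):
        new = (p + q * cumulative / m) * (m - cumulative)
        adoptions.append(new)
        cumulative += new
    return adoptions

# Typical illustrative values: p = 0.03, q = 0.38, market of 10,000.
curve = bass_adoptions(p=0.03, q=0.38, m=10_000, periods=20)
```

Because q exceeds p here, the adoption curve rises to an interior peak before tapering off, the familiar S-shaped cumulative pattern; network-structure arguments like those in the paper amount to explaining p and q from properties of the underlying social network.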


Though the prepurchase effects of advertising on children are well documented, little is known about advertising's impact in consumption settings. Three studies, combining experimentation and depth interviews, are reported to examine this issue. Based on our preliminary findings, special emphasis was placed on the role affective constructs play in shaping children's impressions. Although theory suggests that as children mature they become increasingly willing to discount ad claims, experimental results indicated that it was older children (10-11 year-olds), rather than younger children (7-8 year-olds), who allowed advertisements to influence their interpretations and evaluations of trial experiences. Depth interviews allowed us to enrich our understanding of this phenomenon. Implications of these unanticipated age-related findings are discussed and examined in light of opinions and reactions provided by a set of leading children's advertising researchers.

This study uses an agency and transaction-cost approach to delineate the elements of new contract choices in crop production, and tests the approach using grain farmers' responses to simulated decision situations in which preferences are ordered across contract choices that differ in asset specificity, uncertainty, and risk and cost sharing with contractors. The statistical results indicate that asset specificity significantly influences farmers' choice of contractual arrangements, while uncertainty, interactions between asset specificity and uncertainty, and selected farmer characteristics are significant in pricing behavior and the choice of hybrid contracts.


We consider multivariate tests for cointegration where there are additional cointegrating vectors under the alternative hypothesis. These cointegrating vectors are obscured by the existence of multiple breaks in the deterministic components of the process. The test is based on the supremum of likelihood ratio statistics calculated over a subset of all possible break points. The asymptotic distribution of the proposed statistic is similar in structure to the likelihood ratio statistics of Johansen (1988, 1991). A Monte Carlo experiment is conducted to examine the size and power properties of the proposed tests. The experiment reveals that the proposed tests are more powerful than the likelihood ratio tests of Johansen (1991) when there is a break.