Similar Literature
20 similar documents found (search time: 15 ms)
1.
The stochastic specification of input-output response is examined. Postulates are set forth which seem reasonable on the basis of a priori theorizing and observed behavior. It is found that commonly used formulations are restrictive and may lead to inefficient and biased results. A function which satisfies the postulates is suggested. Two- and four-stage procedures for estimation of the resulting function are then outlined. The estimators are shown to be consistent and, in the latter case, asymptotically efficient under normality.
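The multi-stage idea sketched above can be illustrated with a generic feasible two-stage (FGLS) simulation. This is not the paper's estimator; the data-generating process, the log-variance model, and all parameter values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
x = rng.uniform(1.0, 10.0, n)
# Heteroskedastic response: the error variance grows with the input level,
# the kind of pattern restrictive stochastic specifications rule out.
y = 2.0 + 0.5 * x + rng.normal(0.0, 0.3 * x)

X = np.column_stack([np.ones(n), x])

# Stage 1: OLS, consistent but inefficient under heteroskedasticity.
b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ b_ols

# Stage 2: model the error variance from squared residuals, then reweight.
g, *_ = np.linalg.lstsq(X, np.log(resid**2 + 1e-12), rcond=None)
w = 1.0 / np.sqrt(np.exp(X @ g))
b_fgls, *_ = np.linalg.lstsq(X * w[:, None], y * w, rcond=None)

print("OLS:", b_ols, "FGLS:", b_fgls)
```

Both stages are consistent here; the reweighted second stage is the one that gains efficiency when the variance model is adequate.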

2.
Our purpose is to investigate the ability of different parametric forms to ‘correctly’ estimate consumer demands based on distance functions using Monte Carlo methods. Our approach combines economic theory, econometrics and quadratic approximation. We begin by deriving parameterizations for transformed quadratic functions which are linear in parameters and characterized either by homogeneity or by the translation property. Homogeneity is typical of Shephard distance functions and expenditure functions, whereas translation is characteristic of benefit/shortage or directional distance functions. The functional forms which satisfy these conditions and include both first- and second-order terms are the translog and quadratic forms, respectively. We then derive a primal characterization which is homogeneous and parameterized as translog and a dual model which satisfies the translation property and is specified as quadratic. We assess functional form performance by focusing on empirical violations of the regularity conditions. Our analysis corroborates results from earlier Monte Carlo studies on the production side suggesting that the quadratic form more closely approximates the ‘true’ technology, or in our context consumer preferences, than the translog.
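Counting regularity violations, the criterion used above, can be sketched for a translog form: evaluate the log-derivatives over a grid and flag points where monotonicity fails. The coefficients below are invented for illustration, not estimated from any data.

```python
import numpy as np

# Hypothetical translog coefficients (illustrative only).
a = np.array([0.1, 0.6, 0.4])                  # constant and first-order terms
B = np.array([[0.15, -0.15], [-0.15, 0.15]])   # symmetric second-order terms

def translog_gradient(x):
    """d ln D / d ln x for ln D = a0 + a' ln x + 0.5 * ln x' B ln x."""
    lx = np.log(x)
    return a[1:] + B @ lx

# Monotonicity requires both log-derivatives to be strictly positive;
# count the grid points where the fitted form violates this.
grid = np.exp(np.linspace(-2, 2, 41))
violations = sum(
    np.any(translog_gradient(np.array([x1, x2])) <= 0)
    for x1 in grid for x2 in grid
)
print(f"monotonicity violations: {violations} of {grid.size**2} grid points")
```

In a Monte Carlo comparison one would repeat this for each fitted form and compare violation frequencies across replications.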

3.
We address the issue of equivalence of primal and dual measures of scale efficiency in a general production theory framework. We find that particular types of homotheticity of technologies, which we refer to here as scale homotheticity, provide a necessary and sufficient condition for such equivalence. We also identify the case when scale homotheticity is equivalent to the homothetic structures from Shephard (Theory of Cost and Production Functions, Princeton Studies in Mathematical Economics. Princeton University Press, Princeton, 1970).

4.
A regression discontinuity (RD) research design is appropriate for program evaluation problems in which treatment status (or the probability of treatment) depends on whether an observed covariate exceeds a fixed threshold. In many applications the treatment-determining covariate is discrete. This makes it impossible to compare outcomes for observations “just above” and “just below” the treatment threshold, and requires the researcher to choose a functional form for the relationship between the treatment variable and the outcomes of interest. We propose a simple econometric procedure to account for uncertainty in the choice of functional form for RD designs with discrete support. In particular, we model deviations of the true regression function from a given approximating function—the specification errors—as random. Conventional standard errors ignore the group structure induced by specification errors and tend to overstate the precision of the estimated program impacts. The proposed inference procedure that allows for specification error also has a natural interpretation within a Bayesian framework.
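The group structure described above can be made concrete in a small simulation: a discrete running variable, a random specification error shared by all observations at each support point, and a comparison of conventional versus cluster-robust standard errors. This is a generic sketch of the idea, not the authors' procedure; all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
support = np.arange(-10, 10)          # discrete running variable values
n_per = 50
xg = np.repeat(support, n_per)
d = (xg >= 0).astype(float)           # treatment assigned above the threshold

# Random specification error: the true regression function deviates from
# the linear approximating function by a common draw at each support point.
spec_err = rng.normal(0.0, 0.5, support.size)
y = (1.0 + 0.1 * xg + 2.0 * d
     + spec_err[xg - support[0]]
     + rng.normal(0.0, 1.0, xg.size))

X = np.column_stack([np.ones_like(xg, dtype=float), xg, d])
XtXi = np.linalg.inv(X.T @ X)
b = XtXi @ X.T @ y
u = y - X @ b

# Conventional iid standard error for the treatment-effect coefficient.
se_conv = np.sqrt((u @ u / (len(y) - 3)) * XtXi[2, 2])

# Cluster-robust SE, clustering on the support points of the running variable.
meat = np.zeros((3, 3))
for g in support:
    s = X[xg == g].T @ u[xg == g]     # within-cluster score sum
    meat += np.outer(s, s)
se_clu = np.sqrt((XtXi @ meat @ XtXi)[2, 2])
print(f"conventional SE: {se_conv:.3f}  cluster SE: {se_clu:.3f}")
```

The cluster-robust standard error is substantially larger here, which is exactly the overstatement of precision the abstract warns about.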

5.
Stochastic specification and the estimation of share equations
The standard stochastic specification for a system of share equations is obtained by assuming that the shares have a joint normal distribution with means depending upon exogenous variables and a constant covariance matrix. This specification ignores the requirement that shares be between zero and unity by giving positive probability to shares outside this range. An alternative stochastic specification involving the Dirichlet distribution, which automatically limits shares to the unit simplex, is suggested. A comparison of results obtained from the two specifications is made using sampling experiments and data from three different empirical studies. The sampling experiments and empirical applications show that the results are generally quite close, thus providing some justification for the continued use of the normal distribution specification in the estimation of share equations.
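The contrast between the two specifications is easy to demonstrate by simulation: Dirichlet draws live on the unit simplex by construction, while normal draws with the same means escape [0, 1] with positive probability. The concentration and scale parameters below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
mean_shares = np.array([0.5, 0.3, 0.2])

# Dirichlet specification: every draw lies on the unit simplex.
dir_draws = rng.dirichlet(20 * mean_shares, size=n)

# Normal specification with the same means: nothing keeps draws in [0, 1].
norm_draws = rng.normal(mean_shares, 0.15, size=(n, 3))

frac_dir = np.mean((dir_draws < 0) | (dir_draws > 1))
frac_norm = np.mean((norm_draws < 0) | (norm_draws > 1))
print("Dirichlet outside [0,1]:", frac_dir)
print("Normal outside [0,1]:   ", frac_norm)
```

Whether the out-of-range probability matters in practice depends on how close the mean shares are to the boundary, which is what the paper's sampling experiments assess.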

6.
Maximum likelihood estimation and most statistical tests require a full specification of the error distribution in a model. Under suitable parametric restrictions we can derive least informative specifications. Autoregressive processes prove to be least informative under a few simple variance and covariance restrictions. For the singular multivariate error distribution in a sum-constrained model, several least informative error distributions are obtained using different parametric assumptions on the covariance structure. A combined maximum entropy-maximum likelihood approach provides an alternative to other recent proposals for covariance estimation in small samples.

7.
This paper investigates the properties of a linearized stochastic volatility (SV) model originally from Harvey et al. (Rev Econ Stud 61:247–264, 1994) under an extended flexible specification (discrete mixtures of normals). General closed-form expressions for the moment conditions are derived. We show that our proposed model captures various tail behavior in a more flexible way than the Gaussian SV model, and it can accommodate certain correlation structure between the two innovations. Rather than using likelihood-based estimation methods via MCMC, we use an alternative procedure based on the characteristic function (CF). We derive analytical expressions for the joint CF and present our estimator as the minimizer of the weighted integrated mean-squared distance between the joint CF and its empirical counterpart (ECF). We complete the paper with an empirical application of our model to three stock indices: the S&P 500, the Dow Jones Industrial Average, and the Nasdaq Composite. The proposed model captures the dynamics of the absolute returns well and presents some consistent and supportive evidence for the Taylor effect and the Machina effect.
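The CF-based estimation principle can be sketched in miniature: match the empirical characteristic function of mixture data to the model CF by minimizing a weighted squared distance. This is a drastically simplified stand-in for the paper's joint-CF estimator; the mixture, weight function, and grid are all invented, and only one parameter is estimated.

```python
import numpy as np

rng = np.random.default_rng(3)
# Stand-in data: returns from a two-component normal mixture, the kind of
# flexible innovation class the paper considers.
n = 5000
comp = rng.random(n) < 0.9
r = np.where(comp, rng.normal(0, 1.0, n), rng.normal(0, 3.0, n))

t = np.linspace(0.05, 2.0, 40)                   # CF evaluation grid
ecf = np.exp(1j * np.outer(t, r)).mean(axis=1)   # empirical CF

def model_cf(t, p, s1, s2):
    """CF of a zero-mean two-component normal scale mixture."""
    return p * np.exp(-0.5 * (s1 * t) ** 2) + (1 - p) * np.exp(-0.5 * (s2 * t) ** 2)

# Minimum-distance estimation of the tail-component scale over a grid,
# weighting the integrated squared CF distance by exp(-t^2).
w = np.exp(-t**2)
grid = np.linspace(1.5, 5.0, 141)
obj = [np.sum(w * np.abs(ecf - model_cf(t, 0.9, 1.0, s)) ** 2) for s in grid]
s2_hat = grid[int(np.argmin(obj))]
print("estimated tail-component scale:", s2_hat)
```

The weight function controls how much of the tail information in the CF the estimator uses, the central tuning choice in ECF methods.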

8.
9.
10.
Model specification for state space models is a difficult task, as one has to decide which components to include in the model and whether these components are fixed or time-varying. To this end, a new model-space MCMC method is developed in this paper. It extends the Bayesian variable selection approach, usually applied to variable selection in regression models, to state space models. For non-Gaussian state space models, stochastic model search MCMC makes use of auxiliary mixture sampling. We focus on structural time series models including seasonal components, trend or intervention. The method is applied to various well-known time series.
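The regression-model starting point the paper extends can be illustrated with Bayesian variable selection by full enumeration under a g-prior. With few candidate variables enumeration is exact; a model-space MCMC targets the same posterior when enumeration is infeasible. The prior, data, and marginal-likelihood form below are simplified illustrative choices, not the paper's specification.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(8)
n, p = 200, 4
X = rng.normal(size=(n, p))
beta = np.array([1.5, 0.0, -1.0, 0.0])      # only components 0 and 2 matter
y = X @ beta + rng.normal(size=n)

g = float(n)   # unit-information g-prior

def log_marginal(gamma):
    """Simplified g-prior log marginal likelihood for inclusion vector gamma."""
    k = int(gamma.sum())
    if k == 0:
        return -0.5 * n * np.log(y @ y)
    Xg = X[:, gamma.astype(bool)]
    b = np.linalg.solve(Xg.T @ Xg, Xg.T @ y)
    rss = y @ y - (g / (1 + g)) * (y @ Xg @ b)
    return -0.5 * k * np.log(1 + g) - 0.5 * n * np.log(rss)

models = [np.array(m) for m in product([0, 1], repeat=p)]
lm = np.array([log_marginal(m) for m in models])
post = np.exp(lm - lm.max())
post /= post.sum()
best = models[int(np.argmax(post))]
print("highest-posterior model:", best)
```

In the state space setting, the same inclusion indicators decide whether a component enters at all and whether it is fixed or time-varying, with MCMC moving through model space instead of enumerating it.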

11.
Jorge Navarro, Metrika (2018) 81(4):465-482
The study of stochastic comparisons of coherent systems with different structures is a relevant topic in reliability theory. Several results have been obtained for specific distributions. The present paper is focused on distribution-free comparisons, that is, orderings which do not depend on the component distributions. Different assumptions for the component lifetimes are considered which lead us to different comparison techniques. Thus, if the components are independent and identically distributed (IID) or exchangeable, the orderings are obtained by using signatures. If they are just ID (homogeneous components), then ordering results for distorted distributions are used. In the general case or in the case of independent (heterogeneous) components, a similar technique based on generalized distorted distributions is applied. In these cases, the ordering results may depend on the copula used to model the dependence between the component lifetimes. Some illustrative examples are included in each case.
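The signature technique for IID components can be sketched by simulation: a coherent system's lifetime is a mixture of the order statistics of the component lifetimes, with mixture weights given by the signature vector. The exponential lifetimes below are purely illustrative; the signature comparison itself is distribution-free.

```python
import numpy as np

rng = np.random.default_rng(7)
# Signature of a 2-out-of-3 system: the system fails exactly at the 2nd
# ordered component failure, so the signature vector is (0, 1, 0).
signature = np.array([0.0, 1.0, 0.0])

n, k = 100_000, 3
lifetimes = rng.exponential(1.0, size=(n, k))   # IID component lifetimes
order_stats = np.sort(lifetimes, axis=1)

# System lifetime: mixture of order statistics weighted by the signature.
idx = rng.choice(k, size=n, p=signature)
T_sys = order_stats[np.arange(n), idx]

# Distribution-free ordering check: the 2-out-of-3 system is stochastically
# dominated by the parallel system, whose signature is (0, 0, 1).
T_par = order_stats[:, 2]
print("2-out-of-3 mean:", T_sys.mean(), "parallel mean:", T_par.mean())
```

Because the signatures themselves are ordered, the same dominance holds for any IID component distribution, which is the point of the distribution-free comparison.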

12.
In this paper the author identifies and examines the estimation and specification error biases of the Black-Scholes and Cox-Ross models by using both analytical and Monte Carlo simulation techniques. Several hypotheses are tested. The central hypothesis is whether or not the estimation error bias in the correctly specified model is large enough to make researchers mistakenly pick the “wrong” model as being the “correct” one. The findings in this study support this hypothesis. It is shown that there is a bias toward accepting the Cox-Ross model as correct, even if the Black-Scholes is assumed to be the true model for pricing options.
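The estimation-error channel can be demonstrated with a minimal Monte Carlo: estimate volatility from a finite return sample, plug it into the correctly specified Black-Scholes formula, and observe the resulting pricing-error distribution. The parameter values and sample sizes are invented for illustration.

```python
import math
import numpy as np

def bs_call(S, K, T, r, sigma):
    """Black-Scholes European call price."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

rng = np.random.default_rng(4)
true_sigma, S, K, T, r = 0.2, 100.0, 100.0, 0.5, 0.05
true_price = bs_call(S, K, T, r, true_sigma)

# Estimation-error bias: sigma estimated from 250 daily returns, then
# plugged into the (correctly specified) pricing formula.
errors = []
for _ in range(500):
    rets = rng.normal(0.0, true_sigma / math.sqrt(250), 250)
    sig_hat = rets.std(ddof=1) * math.sqrt(250)
    errors.append(bs_call(S, K, T, r, sig_hat) - true_price)
print("mean pricing error:", np.mean(errors), "sd:", np.std(errors))
```

Even with the true model, the spread of these errors is what can make an alternative specification appear to fit better in finite samples.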

13.
In recent years, it has become customary to derive dynamic behavioral relationships from intertemporal utility maximization. Since the planning interval for which decisions are made is unknown, a likely specification error in dynamic models is that the period for which data are available is longer than the planning interval. This paper brings out the consequences of this specification error for the estimation of dynamic models derived from intertemporal utility maximization.

14.
15.
Multiple-output models of Canadian telecommunications production are estimated under different production equilibria. A specification test is conducted between the short- and long-run equilibrium models and the long-run equilibrium is rejected. In order to capture the nature of the disequilibrium, a dynamic cost of adjustment model is estimated for Bell Canada. There are significant adjustment costs and it is estimated that for $1.00 of marginal capital costs the carrier must incur an additional cost of $0.30 to install the new capital into the production process.

16.
Analysis of the behavior of technical inefficiency with respect to parameters and variables of a stochastic frontier model is a neglected area of research in the frontier literature. An attempt in this direction, however, has recently been made. It has been shown that, in a “standard” stochastic frontier model, both the firm-level technical inefficiency and the production uncertainty are monotonically decreasing with observational error. In this paper we show, considering a stochastic frontier model whose error components are jointly distributed as truncated bivariate normal, that this property holds if and only if the distribution of observational error is negatively skewed. We also derive a necessary and sufficient condition under which both firm-level technical inefficiency and production uncertainty are monotonically increasing with the noise-inefficiency correlation. We next propose a new measure of industry-level production uncertainty and establish the necessary and sufficient condition for firm-level technical inefficiency and production uncertainty to be monotonically increasing with industry-level production uncertainty. We also study the limiting probabilistic behavior of these conditions under different parametric configurations of our model. Finally, we carry out Monte Carlo simulations to study the sample behavior of the population monotonicity property of the firm-level technical inefficiency and production uncertainty in our model.
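The skewness property central to this literature is easy to see in the "standard" model: with symmetric noise and one-sided (half-normal) inefficiency, the composed error is negatively skewed. The variance parameters below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100_000
v = rng.normal(0.0, 0.2, n)              # symmetric observational noise
u = np.abs(rng.normal(0.0, 0.3, n))      # half-normal technical inefficiency
eps = v - u                              # composed error of the frontier model

skew = np.mean((eps - eps.mean())**3) / eps.std()**3
print("composed-error skewness:", skew)
```

The paper's truncated-bivariate-normal model relaxes the independence built into this sketch, which is why the sign of the skewness becomes a condition rather than an automatic property.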
Arabinda Das

17.
Journal of Econometrics (2005) 128(1):165-193
We analyze OLS-based tests of long-run relationships, weak exogeneity and short-run dynamics in conditional error correction models. Unweighted sums of single-equation test statistics are used for hypothesis testing in pooled systems. When model errors are (conditionally) heteroskedastic, tests of weak exogeneity and short-run dynamics are affected by nuisance parameters. Similarly, at the pooled level the advocated test statistics are no longer pivotal in the presence of cross-sectional error correlation. We prove that the wild bootstrap provides asymptotically valid critical values under both conditional heteroskedasticity and cross-sectional error correlation. A Monte Carlo study reveals that in small samples the bootstrap outperforms first-order asymptotic approximations in terms of empirical size even if the asymptotic distribution of the test statistic does not depend on nuisance parameters. Unlike feasible GLS methods, the approach does not require any estimate of cross-sectional correlation and copes with time-varying patterns of contemporaneous error correlation.
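The wild bootstrap relied on above can be sketched for a single-equation slope test under heteroskedasticity: resample restricted residuals with Rademacher weights, which preserves each observation's variance, and compare the observed t-statistic against the bootstrap distribution. This is a textbook version of the device, not the paper's pooled procedure; the data-generating process is invented.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 200
x = rng.uniform(0, 1, n)
y = 1.0 + rng.normal(0, 1 + 2 * x, n)   # true slope is 0; heteroskedastic errors
X = np.column_stack([np.ones(n), x])
XtXi = np.linalg.inv(X.T @ X)

def t_slope(y):
    """Conventional t-statistic for the slope coefficient."""
    b = XtXi @ X.T @ y
    u = y - X @ b
    se = np.sqrt((u @ u / (n - 2)) * XtXi[1, 1])
    return b[1] / se

t_obs = t_slope(y)

# Wild bootstrap under the null: flip the sign of each restricted residual
# independently, so the conditional variance pattern is preserved.
u0 = y - y.mean()
t_star = np.array([
    t_slope(y.mean() + u0 * rng.choice([-1.0, 1.0], n)) for _ in range(999)
])
p_wild = np.mean(np.abs(t_star) >= abs(t_obs))
print("wild-bootstrap p-value:", p_wild)
```

Because the weights are applied observation by observation, no estimate of the error-variance pattern is needed, the same feature that lets the pooled version avoid estimating cross-sectional correlation.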

18.
Recent advances in multi-region input-output (IO) table construction have led to large databases becoming available. Some of these databases currently demand too much computer memory or user cognition to be handled effectively outside high-performance environments, especially for applications such as virtual laboratories, computable general equilibrium modelling, linear programming, series expansion, or structural decomposition analysis, thus inhibiting their widespread use by analysts and decision-makers. Aggregation is an obvious solution, but there is a need for structured approaches to aggregating an IO system in a way that does not compromise the ability to effectively answer the research question at hand. In this article, I describe how structural path analysis can be used to realise a computationally inexpensive method for aggregating IO systems whilst minimising aggregation errors. I show that there exists no one-size-fits-all strategy, but that optimal aggregation depends on the research question at hand.
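The series expansion and structural paths mentioned above can be shown on a toy IO system: each power of the technical-coefficient matrix adds one tier of supply-chain paths, and the powers sum to the Leontief inverse. The 3-sector coefficients and final demand are hypothetical.

```python
import numpy as np

# Tiny illustrative IO system (hypothetical 3-sector coefficients).
A = np.array([[0.1, 0.2, 0.0],
              [0.3, 0.1, 0.2],
              [0.0, 0.1, 0.2]])
f = np.array([100.0, 50.0, 80.0])       # final demand by sector

# Leontief inverse and its power series: each power of A adds one layer
# of structural paths (direct, first-tier, second-tier, ...).
L = np.linalg.inv(np.eye(3) - A)
series = sum(np.linalg.matrix_power(A, k) for k in range(50))
print("series matches inverse:", np.allclose(L, series))

x = L @ f                               # total output by sector
# Contribution of a single first-order path, sector 2 input into sector 1:
path = A[0, 1] * f[1]
print("total output:", x, " path 2->1 contribution:", path)
```

Structural path analysis ranks such path contributions; aggregating sectors that appear only in small paths is what keeps the aggregation error low.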

19.
In modeling real-world planning problems as optimization programs, the assumption that all parameters are known with certainty is often more seriously violated than the assumption that the objective function and the constraints can be approximated sufficiently accurately by linear functions. In this paper we discuss a concrete application of stochastic programming to a multiperiod production planning problem in which the demand for the products during the various periods is assumed stochastic with known probability distribution. Since the resulting stochastic program does not possess the property of “simple” recourse, no direct use can be made of existing methods that have been proposed in the literature for solving problems of this type.
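The structure of such problems can be seen in a single-period sketch with simple recourse (the easy case the paper's model does not have): choose production before demand is known, then pay scenario-dependent holding or shortage costs. All costs, scenarios, and probabilities below are invented.

```python
import numpy as np

# Choose production q before demand is realized.
# Costs: c per unit produced, h per unit left over, p per unit short.
c, h, p = 1.0, 0.5, 4.0
demand = np.array([80, 100, 120, 150])       # demand scenarios
prob = np.array([0.2, 0.4, 0.3, 0.1])        # known probabilities

def expected_cost(q):
    over = np.maximum(q - demand, 0)         # recourse: holding leftovers
    short = np.maximum(demand - q, 0)        # recourse: covering shortage
    return c * q + prob @ (h * over + p * short)

grid = np.arange(60, 181)
q_star = grid[int(np.argmin([expected_cost(q) for q in grid]))]
print("optimal production:", q_star)
```

The multiperiod problem couples these decisions across periods through inventory, which is what destroys the simple-recourse structure and forces the specialized solution approach discussed in the paper.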

20.
This paper presents a contingency model designed to explain aggregate differences in the specifications for production planning and control systems across firms. The specifications for a firm's system, stated broadly in terms of the investment in information processing systems and organizational integrativeness, are related to that firm's competitive strategy and environment. An analysis of data from a limited sample of companies, and a panel of managers, provides insights into the applicability of the concepts used in the model, and its potential usefulness.
