Similar Articles
20 similar articles found.
1.
We describe procedures for Bayesian estimation and testing in cross-sectional, panel data and nonlinear smooth coefficient models. The smooth coefficient model is a generalization of the partially linear or additive model wherein coefficients on linear explanatory variables are treated as unknown functions of an observable covariate. In the approach we describe, points on the regression lines are regarded as unknown parameters and priors are placed on differences between adjacent points to introduce the potential for smoothing the curves. The algorithms we describe are quite simple to implement—for example, estimation, testing and smoothing parameter selection can be carried out analytically in the cross-sectional smooth coefficient model.
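The smoothing device just described (priors on differences between adjacent fitted points) has a simple Gaussian analogue with a closed-form posterior mean. The sketch below is not the paper's algorithm, only a minimal illustration: a random-walk prior on the points g_1, ..., g_n gives a ridge-type solve, and the smoothing parameter `lambda_` and toy data are hypothetical.

```python
import numpy as np

# Minimal sketch: with y_i = g(z_i) + e_i, treat the points g_1..g_n as
# parameters and place a Gaussian prior on first differences g_{i+1} - g_i.
# The posterior mean then solves (I + lambda * D'D) g = y, where
# lambda_ is the (hypothetical) ratio of error variance to prior variance.
def smooth_posterior_mean(y, lambda_=10.0):
    n = len(y)
    D = np.diff(np.eye(n), axis=0)  # first-difference matrix, (n-1) x n
    return np.linalg.solve(np.eye(n) + lambda_ * D.T @ D, y)

# Toy data (illustrative only).
z = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * z) + 0.3 * np.random.default_rng(0).normal(size=50)
g_hat = smooth_posterior_mean(y)
```

Larger `lambda_` shrinks adjacent differences harder, so the fitted curve is smoother than the raw data.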

2.
In the spirit of Smale’s work, we consider pure exchange economies with general consumption sets. In this paper, the consumption set of each household is described in terms of a function called possibility function. The main innovation comes from the dependency of each possibility function with respect to the individual endowments. We prove that, generically in the space of endowments and possibility functions, economies are regular. A regular economy has a finite number of equilibria, which locally depend on endowments and possibility functions in a continuous manner.

3.
Bayesian model selection with posterior probabilities and no subjective prior information is generally not possible because the Bayes factors are ill-defined. Using careful consideration of the parameter of interest in cointegration analysis and a re-specification of the triangular model of Phillips (Econometrica, Vol. 59, pp. 283–306, 1991), this paper presents an approach that allows for Bayesian comparison of models of cointegration with ‘ignorance’ priors. Using the concept of Stiefel and Grassmann manifolds, diffuse priors are specified on the dimension and direction of the cointegrating space. The approach is illustrated using a simple term structure of interest rates model.

4.
The objective of this paper is to identify variational preferences and multiple-prior (maxmin) expected utility functions that exhibit aversion to risk under some probability measure from among the priors. Risk aversion has profound implications on agents’ choices and on market prices and allocations. Our approach to risk aversion relies on the theory of mean-independent risk of Werner (2009). We identify necessary and sufficient conditions for risk aversion of convex variational preferences and concave multiple-prior expected utilities. The conditions are stability of the cost function and of the set of probability priors, respectively, with respect to a probability measure. The two stability properties are new concepts. We show that cost functions defined by the relative entropy distance or other divergence distances have that property. Sets of priors defined as cores of convex distortions of probability measures or as neighborhoods in divergence distances have that property, too.

5.
We propose a general two-step estimator for a popular Markov discrete choice model that includes a class of Markovian games with continuous observable state space. Our estimation procedure generalizes the computationally attractive methodology of Pesendorfer and Schmidt-Dengler (2008) that assumed finite observable states. This extension is non-trivial as the policy value functions are solutions to some type II integral equations. We show that the inverse problem is well-posed. We provide a set of primitive conditions to ensure root-T consistent estimation for the finite dimensional structural parameters and the distribution theory for the value functions in a time series framework.

6.
This paper analyzes the properties of a class of estimators, tests, and confidence sets (CSs) when the parameters are not identified in parts of the parameter space. Specifically, we consider estimator criterion functions that are sample averages and are smooth functions of a parameter θ. This includes log likelihood, quasi-log likelihood, and least squares criterion functions.

7.
The existence of stationary processes of temporary equilibria is examined in an OLG model, where there are finitely many commodities and consumers in each period, and endowments profiles and expectations profiles are subject to stochastic shocks. A state space is taken as the set of all payoff-relevant variables, and the dynamics of the economy are captured as a stochastic process in the state space. In our model, however, the state space does not necessarily admit a compact truncation consistent with the intertemporal restrictions because distributions over expectations profiles may have non-compact supports. As shown in Duffie et al. (Duffie, D., Geanakoplos, J., Mas-Colell, A., McLennan, A., 1994. Stationary Markov equilibria. Econometrica 62, 745–781), such a compact truncation, called a self-justified set, is essential for the existence of stationary Markov equilibria. We extend their existence theorem so as to be applicable to our model.

8.
We consider classes of multivariate distributions which can model skewness and are closed under orthogonal transformations. We review two classes of such distributions proposed in the literature and focus our attention on a particular, yet quite flexible, subclass of one of these classes. Members of this subclass are defined by affine transformations of univariate (skewed) distributions that ensure the existence of a set of coordinate axes along which there is independence and the marginals are known analytically. The choice of an appropriate m-dimensional skewed distribution is then restricted to the simpler problem of choosing m univariate skewed distributions. We introduce a Bayesian model comparison setup for selection of these univariate skewed distributions. The analysis does not rely on the existence of moments (allowing for any tail behaviour) and uses equivalent priors on the common characteristics of the different models. Finally, we apply this framework to multi-output stochastic frontiers using data from Dutch dairy farms.
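The construction described (affine transformations of independent univariate skewed variables) can be sketched directly. The univariate choice below (standardized chi-square shocks) is a hypothetical stand-in for the skewed distributions the paper selects, and the affine map `A` is illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hedged sketch: an m-variate skewed draw is built by stacking m
# independent univariate skewed variables z (here, illustratively,
# chi-square(df) shocks standardized to mean 0 and variance 1) and
# applying an affine map x = mu + A z.
def skewed_mv_sample(mu, A, size, df=4):
    m = len(mu)
    z = (rng.chisquare(df, size=(size, m)) - df) / np.sqrt(2 * df)
    return mu + z @ A.T

A = np.array([[1.0, 0.0],
              [0.5, 1.0]])            # hypothetical affine map
draws = skewed_mv_sample(np.zeros(2), A, size=1000)
```

Along the coordinate axes of z the components are independent with known (skewed) marginals, which is the point of the subclass described above.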

9.
The problem of irreversible investment with idiosyncratic risk is studied by interpreting market incompleteness as a source of ambiguity over the appropriate no-arbitrage discount factor. The maxmin utility over multiple priors framework is used to model and solve the irreversible investment problem. Multiple priors are modeled using the notion of κ-ignorance. This set-up is used to analyze finitely lived options. For infinitely lived options the notion of constant κ-ignorance is introduced. For these sets of density generators the corresponding optimal stopping problem is solved for general (in-)finite horizon optimal stopping problems driven by geometric Brownian motion. It is argued that an increase in the set of priors delays investment, whereas an increase in the degree of market completeness can have a non-monotonic effect on investment.

10.
Many structural break and regime-switching models have been used with macroeconomic and financial data. In this paper, we develop an extremely flexible modeling approach which can accommodate virtually any of these specifications. We build on earlier work showing the relationship between flexible functional forms and random variation in parameters. Our contribution centers on priors on the time variation, developed by considering a hypothetical reordering of the data and the distance between neighboring (reordered) observations. The range of priors produced in this way can accommodate a wide variety of nonlinear time series models, including those with regime-switching and structural breaks. By allowing the amount of random variation in parameters to depend on the distance between (reordered) observations, the parameters can evolve in a wide variety of ways, allowing for everything from models exhibiting abrupt change (e.g. threshold autoregressive models or standard structural break models) to those which allow for a gradual evolution of parameters (e.g. smooth transition autoregressive models or time varying parameter models). Bayesian econometric methods are developed for estimating the distance function and the types of hypothetical reordering. Conditional on a hypothetical reordering and distance function, a simple reordering of the actual data allows us to estimate our models with standard state space methods by a simple adjustment to the measurement equation. We use artificial data to show the advantages of our approach, before providing two empirical illustrations involving the modeling of real GDP growth.

11.
We develop new methods for representing the asset-pricing implications of stochastic general equilibrium models. We provide asset-pricing counterparts to impulse response functions and the resulting dynamic value decompositions (DVDs). These methods quantify the exposures of macroeconomic cash flows to shocks over alternative investment horizons and the corresponding prices or investors’ compensations. We extend the continuous-time methods developed in Hansen and Scheinkman (2012) and Borovička et al. (2011) by constructing discrete-time, state-dependent, shock-exposure and shock-price elasticities as functions of the investment horizon. Our methods are applicable to economic models that are nonlinear, including models with stochastic volatility.

12.
In this paper we compare classical econometrics, calibration and Bayesian inference in the context of the empirical analysis of factor demands. Our application is based on a popular flexible functional form for the firm's cost function, namely Diewert's Generalized Leontief function, and uses the well-known Berndt and Wood 1947–1971 KLEM data on the US manufacturing sector. We illustrate how the Gibbs sampling methodology can be easily used to calibrate parameter values and elasticities on the basis of previous knowledge from alternative studies on the same data, but with different functional forms. We rely on a system of mixed non-informative diffuse priors for some key parameters and informative tight priors for others. Within the Gibbs sampler, we employ rejection sampling to incorporate parameter restrictions, which are suggested by economic theory but in general rejected by economic data. Our results show that values of those parameters that relate to non-informative priors are almost equal to the standard SUR estimates, whereas differences come out for those parameters to which we have assigned informative priors. Moreover, discrepancies can be appreciated in some crucial parameter estimates obtained with or without rejection sampling.
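The rejection-sampling step mentioned above can be shown in isolation: draws from a conditional posterior are discarded until a theory-implied restriction holds. The Gaussian conditional and the sign restriction below are hypothetical stand-ins, not the paper's cost-function model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hedged sketch of a rejection step inside one Gibbs sweep: redraw from
# the (hypothetical) Gaussian conditional posterior until the draw
# satisfies a restriction suggested by economic theory.
def draw_restricted(mean, sd, restriction, max_tries=10_000):
    for _ in range(max_tries):
        draw = rng.normal(mean, sd)
        if restriction(draw):
            return draw
    raise RuntimeError("restriction region has negligible posterior mass")

# Illustrative restriction: a non-negative coefficient, even though the
# unrestricted conditional mean is slightly negative.
beta = draw_restricted(mean=-0.1, sd=0.5, restriction=lambda b: b >= 0.0)
```

In a full sampler this replaces the unrestricted draw for the restricted parameters only; the other conditionals are sampled as usual.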

13.
In this work, we analyze the performance of production units using the directional distance function, which allows us to measure the distance to the frontier of the production set along any direction in the input/output space. We show that this distance can be expressed as a simple transformation of the radial or hyperbolic distance. This formulation allows us to define robust directional distances along the lines of α-quantile or order-m partial frontiers, as well as conditional directional distance functions, conditional on environmental factors. We propose simple methods of estimation and derive the asymptotic properties of our estimators.
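A minimal way to see a directional distance in practice is a free-disposal-hull style estimate (an illustrative assumption; the paper's robust and conditional versions are more involved): the distance of a unit (x0, y0) in direction (dx, dy) is the largest beta such that (x0 - beta*dx, y0 + beta*dy) is still dominated by some observed unit.

```python
import numpy as np

# Hedged sketch (free-disposal-hull flavour, not the paper's estimator).
# A point is dominated by observation j when xj <= x0 - beta*dx and
# yj >= y0 + beta*dy, i.e. beta <= min over the componentwise ratios;
# the directional distance is the best such beta over all observations.
def fdh_directional(x0, y0, X, Y, dx, dy):
    betas = []
    for xj, yj in zip(X, Y):
        b = min(np.min((x0 - xj) / dx), np.min((yj - y0) / dy))
        betas.append(b)
    return max(betas)

X = np.array([[1.0], [2.0], [4.0]])   # toy inputs
Y = np.array([[1.0], [3.0], [4.0]])   # toy outputs
beta = fdh_directional(np.array([3.0]), np.array([1.5]), X, Y,
                       dx=np.array([1.0]), dy=np.array([1.0]))
```

Setting dy = 0 recovers an input-oriented (radial-type) measure, which is the kind of transformation the abstract alludes to.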

14.
We propose a natural conjugate prior for the instrumental variables regression model. The prior is a natural conjugate one since the marginal prior and posterior of the structural parameter have the same functional expressions, which directly reveal the update from prior to posterior. The Jeffreys prior results from a specific setting of the prior parameters and yields a marginal posterior of the structural parameter with a functional form identical to the sampling density of the limited information maximum likelihood estimator. We construct informative priors for the Angrist–Krueger [1991. Does compulsory school attendance affect schooling and earnings? Quarterly Journal of Economics 106, 979–1014] data and show that the marginal posterior of the return on education in the US coincides with the marginal posterior from the Southern region when we use the Jeffreys prior. This result occurs since the instruments are the strongest in the Southern region and the posterior using the Jeffreys prior, identical to maximum likelihood, focuses on the strongest available instruments. We construct informative priors for the other regions that make their posteriors of the return on education similar to that of the US and the Southern region. These priors show the amount of prior information needed to obtain comparable results for all regions.

15.
We study the problem of building confidence sets for ratios of parameters from an identification-robust perspective. In particular, we address the simultaneous confidence set estimation of a finite number of ratios. Results apply to a wide class of models suitable for estimation by consistent asymptotically normal procedures. Conventional methods (e.g. the delta method), which are derived by excluding the parameter discontinuity regions entailed by the ratio functions and which typically yield bounded confidence limits, break down even if the sample size is large (Dufour, 1997). One solution to this problem, which we take in this paper, is to use variants of Fieller’s (1940, 1954) method. By inverting a joint test that does not require identifying the ratios, Fieller-based confidence regions are formed for the full set of ratios. Simultaneous confidence sets for individual ratios are then derived by applying projection techniques, which allow for possibly unbounded outcomes. In this paper, we provide simple explicit closed-form analytical solutions for projection-based simultaneous confidence sets in the case of linear transformations of ratios. Our solution further provides a formal proof for the expressions in Zerbe et al. (1982) pertaining to individual ratios. We apply the geometry of quadrics, as introduced in earlier work, in a different although related context. The confidence sets so obtained are exact if the inverted test statistic admits a tractable exact distribution, for instance in the normal linear regression context. The proposed procedures are applied and assessed via illustrative Monte Carlo and empirical examples, with a focus on discrete choice models estimated by exact or simulation-based maximum likelihood. Our results underscore the superiority of Fieller-based methods.
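Fieller's method for a single ratio theta = a/b admits a compact sketch: invert the quadratic inequality obtained from the t-statistic of a_hat - theta*b_hat. The branch handling below is simplified (the unbounded and exclusive cases are collapsed to the whole real line), which is an assumption for brevity rather than the paper's full treatment.

```python
import math

# Hedged sketch of Fieller's (1940, 1954) confidence set for theta = a/b,
# given estimates (a_hat, b_hat) with variances vaa, vbb and covariance
# vab. Inverting
#   (a_hat - theta*b_hat)^2 <= z^2 (vaa - 2*theta*vab + theta^2*vbb)
# gives a quadratic A*theta^2 + B*theta + C <= 0 in theta.
def fieller(a_hat, b_hat, vaa, vbb, vab, z=1.96):
    A = b_hat ** 2 - z ** 2 * vbb
    B = -2.0 * (a_hat * b_hat - z ** 2 * vab)
    C = a_hat ** 2 - z ** 2 * vaa
    disc = B ** 2 - 4.0 * A * C
    if A > 0 and disc >= 0:            # bounded interval (strong denominator)
        r = math.sqrt(disc)
        return ((-B - r) / (2 * A), (-B + r) / (2 * A))
    return (-math.inf, math.inf)       # simplified: unbounded/exclusive cases

lo, hi = fieller(a_hat=2.0, b_hat=1.0, vaa=0.04, vbb=0.01, vab=0.0)
```

When the denominator estimate is weak (A <= 0) the set is unbounded, which is exactly the behaviour bounded delta-method intervals cannot reproduce.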

16.
The paper develops a novel testing procedure for hypotheses on deterministic trends in a multivariate trend stationary model. The trends are estimated by the OLS estimator and the long run variance (LRV) matrix is estimated by a series type estimator with carefully selected basis functions. Regardless of whether the number of basis functions K is fixed or grows with the sample size, the Wald statistic converges to a standard distribution. It is shown that critical values from the fixed-K asymptotics are second-order correct under the large-K asymptotics. A new practical approach is proposed to select K that addresses the central concern of hypothesis testing: the selected smoothing parameter is testing-optimal in that it minimizes the type II error while controlling for the type I error. Simulations indicate that the new test is as accurate in size as the nonstandard test of Vogelsang and Franses (2005) and as powerful as the corresponding Wald test based on the large-K asymptotics. The new test therefore combines the advantages of the nonstandard test and the standard Wald test while avoiding their main disadvantages (power loss and size distortion, respectively).
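A series-type LRV estimator of the kind described can be sketched in a few lines. The cosine basis and the value of K below are illustrative assumptions, not the paper's carefully selected basis or its testing-optimal K.

```python
import numpy as np

# Hedged sketch of a series LRV estimator for a scalar series u_t:
# project u on K orthonormal basis functions phi_k(t/T) (here,
# illustratively, phi_k(s) = sqrt(2) cos(k*pi*s)) and average the K
# squared projection coefficients.
def series_lrv(u, K=8):
    T = len(u)
    s = (np.arange(1, T + 1) - 0.5) / T
    lam = np.array([np.sqrt(2.0) * np.cos(k * np.pi * s) @ u / np.sqrt(T)
                    for k in range(1, K + 1)])
    return np.mean(lam ** 2)

rng = np.random.default_rng(0)
omega_hat = series_lrv(rng.normal(size=500))   # iid data: true LRV is 1
```

With fixed K the resulting Wald statistic has a fixed-K limit (an F-type distribution), while letting K grow recovers the usual chi-square asymptotics, which is the tension the paper's second-order result resolves.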

17.
We develop a general procedure to construct pairwise meeting processes characterized by two features. First, in each period the process maximizes the number of matches in the population. Second, over time agents meet everybody else exactly once. We call this type of meeting “absolute strangers.” Our methodological contribution to economics is to offer a simple procedure for constructing a type of decentralized trading environment usually employed in both theoretical and experimental economics. In particular, we demonstrate how to make use of the mathematics of Latin squares to enrich the modeling of matching economies.
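The Latin-square construction is closely related to round-robin tournament scheduling. The circle method below is a standard way (an illustrative stand-in for the paper's procedure) to realize an "absolute strangers" process: with an even number n of agents, n - 1 rounds of perfect matchings in which every pair meets exactly once.

```python
# Hedged sketch via the round-robin (circle) method: agent 0 stays fixed
# while the other agents rotate one position per round; pairing position
# i with position n-1-i yields a perfect matching each round, and over
# n - 1 rounds every pair of agents meets exactly once.
def absolute_strangers(n):
    agents = list(range(n))
    rounds = []
    for _ in range(n - 1):
        rounds.append([(agents[i], agents[n - 1 - i]) for i in range(n // 2)])
        agents = [agents[0]] + [agents[-1]] + agents[1:-1]  # rotate all but 0
    return rounds

schedule = absolute_strangers(6)   # 5 rounds of 3 matches each
```

Each round maximizes the number of matches (a perfect matching), and the union of rounds covers all n(n-1)/2 pairs, which are exactly the two features described above.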

18.
A major aim in recent nonparametric frontier modeling is to estimate a partial frontier well inside the sample of production units but near the optimal boundary. Two concepts of partial boundaries of the production set have been proposed: an expected maximum output frontier of order m=1,2,… and a conditional quantile-type frontier of order α∈]0,1]. In this paper, we answer the important question of how the two families are linked. For each m, we specify the order α for which both partial production frontiers can be compared. We show that even one perturbation in the data is sufficient for breakdown of the nonparametric order-m frontiers, whereas the global robustness of the order-α frontiers attains a higher breakdown value. Nevertheless, once the α frontiers break down, they become less resistant to outliers than the order-m frontiers. Moreover, the m frontiers have the advantage of being statistically more efficient. Based on these findings, we suggest a methodology for identifying outlying data points. We establish some asymptotic results, filling important gaps in the literature. The theoretical findings are illustrated via simulations and real data.
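The order-m frontier has a direct Monte Carlo approximation: the expected maximum of m output draws among units using no more input than x. The toy production data and the tuning constants m and B below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hedged sketch of the order-m output frontier at input level x:
# phi_m(x) = E[ max(Y_1, ..., Y_m) | X <= x ], approximated by drawing
# m outputs (with replacement) from the units with input <= x and
# averaging the maxima over B replications.
def order_m_frontier(x, X, Y, m=25, B=200):
    pool = Y[X <= x]
    draws = rng.choice(pool, size=(B, m), replace=True)
    return draws.max(axis=1).mean()

# Toy single-input, single-output production data.
X = rng.uniform(0.0, 1.0, 200)
Y = X * rng.uniform(0.5, 1.0, 200)
phi_m = order_m_frontier(0.8, X, Y)
```

As m grows the order-m frontier approaches the full (FDH-type) frontier; for moderate m it stays inside the data cloud, which is what makes it less sensitive to a single extreme observation than the full frontier.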

19.
We study Pareto efficiency in a setting that involves two kinds of uncertainty: Uncertainty over the possible outcomes is modeled using lotteries whereas uncertainty over the agents’ preferences over lotteries is modeled using sets of plausible utility functions. A lottery is universally Pareto undominated if there is no other lottery that Pareto dominates it for all plausible utility functions. We show that, under fairly general conditions, a lottery is universally Pareto undominated iff it is Pareto efficient for some vector of plausible utility functions, which in turn is equivalent to affine welfare maximization for this vector. In contrast to previous work on linear utility functions, we use the significantly more general framework of skew-symmetric bilinear (SSB) utility functions as introduced by Fishburn (1982). Our main theorem generalizes a theorem by Carroll (2010) and implies the ordinal efficiency welfare theorem. We discuss three natural classes of plausible utility functions, which lead to three notions of ordinal efficiency, including stochastic dominance efficiency, and conclude with a detailed investigation of the geometric and computational properties of these notions.

20.
This paper introduces a general framework for the fair allocation of indivisible objects when each agent can consume at most one (e.g., houses, jobs, queuing positions) and monetary compensations are possible. This framework enables us to deal with identical objects and monotonicity of preferences in ranking objects. We show that the no-envy solution is the only solution satisfying equal treatment of equals, Maskin monotonicity, and a mild continuity property. The same axiomatization holds if the continuity property is replaced by a neutrality property.
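The no-envy condition at the heart of the characterization is easy to state in quasi-linear form (a simplifying assumption for illustration; the paper's framework allows more general preferences): no agent may prefer another agent's object-plus-transfer bundle to her own.

```python
from itertools import product

# Hedged sketch: check no-envy of an object-and-money allocation under
# (hypothetical) quasi-linear preferences. Agent i receives object
# assignment[i] and money transfers[i], and values bundle (o, t) at
# v[i][o] + t. No-envy: every agent weakly prefers her own bundle.
def is_envy_free(v, assignment, transfers):
    n = len(assignment)
    return all(v[i][assignment[i]] + transfers[i] >=
               v[i][assignment[j]] + transfers[j]
               for i, j in product(range(n), repeat=2))

v = [[10, 4],   # hypothetical valuations: agent 0 over objects 0 and 1
     [6, 6]]    # agent 1 over objects 0 and 1
ok = is_envy_free(v, assignment=[0, 1], transfers=[0.0, 3.0])
```

Here the transfer to agent 1 compensates her for receiving the object she values no more than the other, so neither agent envies the other's bundle.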


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.). 京ICP备09084417号