Similar Articles
20 similar articles found (search time: 53 ms)
1.
This note describes how the incomplete markets model with aggregate uncertainty in Den Haan et al. [Comparison of solutions to the incomplete markets model with aggregate uncertainty. Journal of Economic Dynamics and Control, this issue] is solved using standard quadrature and projection methods. This is made possible by linking the aggregate state variables to a parameterized density that describes the cross-sectional distribution. A simulation procedure is used to find the best shape of the density within the class of approximating densities considered. This note compares several simulation procedures in which there is—as in the model—no cross-sectional sampling variation.
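One way to picture the parameterization step (a sketch only; the paper's exact functional form may differ) is an exponential-of-polynomial density whose coefficients are pinned down by the cross-sectional moments carried as aggregate state variables:

$$ P(a;\rho) = \rho_0 \exp\Big(\sum_{k=1}^{K} \rho_k a^k\Big), \qquad \int a^k P(a;\rho)\,da = m_k, \quad k = 1,\dots,K, $$

so that knowing the moments m_1, ..., m_K suffices to reconstruct an approximate cross-sectional distribution at any point of the algorithm.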

2.
We use a perturbation method to solve the incomplete markets model with aggregate uncertainty described in Den Haan et al. [Computational suite of models with heterogeneous agents: incomplete markets and aggregate uncertainty. Journal of Economic Dynamics & Control, this issue]. To apply that method, we use a “barrier method” to replace the original problem, which has occasionally binding inequality constraints, with one that has only equality constraints. We replace the structure with a continuum of agents by a setting in which a single infinitesimal agent faces prices generated by a representative-agent economy. We also solve a model variant with a large (but finite) number of agents. Our perturbation-based method is much simpler and faster than other methods.
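As a rough illustration of the barrier idea (a minimal sketch assuming a one-period savings choice with log utility; eta, a_min, and the continuation value are placeholders, not the paper's calibration), the occasionally binding constraint a' >= a_min is dropped and a log-barrier penalty is added, so the first-order conditions become smooth equalities that standard perturbation can handle:

    import numpy as np
    from scipy.optimize import minimize_scalar

    beta, a_min, eta = 0.96, 0.0, 1e-3   # eta -> 0 recovers the constrained problem

    def u(c):
        return np.log(c)

    def objective(a_next, cash_on_hand=1.0, v_next=u):
        c = cash_on_hand - a_next
        # the barrier keeps a_next strictly above a_min, so the KKT
        # inequality never binds and standard perturbation applies
        return -(u(c) + eta * np.log(a_next - a_min) + beta * v_next(a_next))

    res = minimize_scalar(objective, bounds=(a_min + 1e-9, 1.0 - 1e-9), method="bounded")
    print(res.x)  # interior savings choice for this eta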

3.
We propose a method to solve models with heterogeneous agents and aggregate uncertainty. The law of motion describing aggregate behavior is obtained by explicitly aggregating the individual policy rule. The algorithm is simpler and faster than existing algorithms that rely on parameterization of the cross-sectional distribution and/or a computationally intensive simulation step. Explicit aggregation establishes a link between the individual policy rule and the set of necessary aggregate state variables, an insight that can be helpful in determining what state variables to include in other algorithms as well.
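A minimal sketch of what “explicitly aggregating the individual policy rule” can look like (the grid, histogram, and policy rule g below are illustrative stand-ins): next period's aggregate capital is obtained by integrating the individual savings rule against the current cross-sectional histogram, with no simulation step.

    import numpy as np

    a_grid = np.linspace(0.0, 10.0, 200)          # individual asset grid
    hist = np.exp(-a_grid); hist /= hist.sum()    # stand-in cross-sectional histogram

    def g(a, K):
        # placeholder individual policy rule; in the algorithm this comes
        # from solving the individual problem given the aggregate state K
        return 0.9 * a + 0.05 * K

    K = hist @ a_grid                 # current aggregate capital
    K_next = hist @ g(a_grid, K)      # explicit aggregation of the policy rule
    print(K, K_next)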

4.
A decision maker receives signals imperfectly correlated with an unobservable state variable and must take actions whose payoffs depend on the state. The state randomly changes over time. In this environment, we examine the performance of simple linear updating rules relative to Bayesian learning. We show that a range of parameters exists for which linear learning results in exactly the same decisions as Bayesian learning, although not in the same beliefs. Outside this parameter range, we use simulations to demonstrate that the consumption level attainable under the optimal linear rule is virtually indistinguishable from the one attainable under Bayes’ rule, although the respective decisions will not always be identical. These results suggest that simple rules of thumb can have an advantage over Bayesian updating when complex calculations are costlier to perform than simple ones. We demonstrate the implications of such an advantage in an evolutionary model where agents “learn to learn.”
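To make the comparison concrete, here is a minimal simulation sketch (two states, a symmetric switching probability, a binary signal, and a constant-gain linear rule are all assumptions for illustration) showing that the two belief paths differ while the threshold decisions largely coincide:

    import numpy as np

    rng = np.random.default_rng(0)
    delta, q, lam, T = 0.05, 0.75, 0.3, 10_000  # switching prob, signal accuracy, linear gain

    s = 0
    p_bayes, p_lin = 0.5, 0.5
    agree = 0
    for _ in range(T):
        if rng.random() < delta:               # hidden state switches
            s = 1 - s
        y = s if rng.random() < q else 1 - s   # noisy binary signal

        # Bayesian update: propagate through the switching kernel, then condition on y
        p_pred = p_bayes * (1 - delta) + (1 - p_bayes) * delta
        like1 = q if y == 1 else 1 - q
        like0 = 1 - q if y == 1 else q
        p_bayes = like1 * p_pred / (like1 * p_pred + like0 * (1 - p_pred))

        # Linear rule: constant-gain average of past belief and current signal
        p_lin = (1 - lam) * p_lin + lam * y

        agree += (p_bayes > 0.5) == (p_lin > 0.5)  # same *decision*, not same belief

    print(agree / T)  # fraction of periods with identical threshold decisions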

5.
Akihiro, Takeshi, Shoko. Socio-Economic Planning Sciences, 2009, 43(4): 263–273
This paper presents a Data Envelopment Analysis/Malmquist index (DEA/MI) analysis of the change in quality-of-life (QOL), which is defined as the state of a social system as measured by multiple social-indicators. Applying panel data from Japan's 47 prefectures for the period 1975–2002, we identify significant movement in the country's overall QOL using a “cumulative” frontier shift index. Results suggest that Japan's QOL rose during the so-called “bubble economy years” (second half of the 1980s), and then dropped in the succeeding “lost-decade” (1990s). We also identify those prefectures considered most “responsible” for the shift(s) in QOL. Moreover, the use of both upper- and lower-bound DEAs enabled an evaluation of both “good” and “bad” movements in QOL.
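For reference, the standard Malmquist index between periods t and t+1, whose frontier-shift component is what such a “cumulative” index chains over years, is (with D the distance functions estimated by DEA):

$$ M_{t,t+1} = \left[ \frac{D^{t}(x^{t+1},y^{t+1})}{D^{t}(x^{t},y^{t})} \cdot \frac{D^{t+1}(x^{t+1},y^{t+1})}{D^{t+1}(x^{t},y^{t})} \right]^{1/2}, $$

which factors into an efficiency-change term D^{t+1}(x^{t+1},y^{t+1}) / D^{t}(x^{t},y^{t}) times a frontier-shift term given by the geometric mean of the cross-period distance ratios; cumulating the frontier-shift term across 1975–2002 traces the movement of the QOL frontier itself.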

6.
This article describes an approach to computing the stochastic growth model with idiosyncratic and aggregate risk that relies on collapsing the aggregate state space down to a small number of moments used to forecast future prices. One innovation relative to most of the literature is the use of a non-stochastic simulation routine.
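A minimal sketch of one non-stochastic simulation step (the grid, initial histogram, and savings rule below are illustrative placeholders): rather than simulating a panel of agents, the cross-sectional histogram itself is pushed forward, splitting each mass point between the two grid nodes that bracket the policy rule's image, so there is no sampling noise at all:

    import numpy as np

    a_grid = np.linspace(0.0, 10.0, 100)
    hist = np.full_like(a_grid, 1.0 / a_grid.size)                # current distribution
    policy = np.clip(0.9 * a_grid + 0.3, a_grid[0], a_grid[-1])   # stand-in savings rule

    hist_next = np.zeros_like(hist)
    idx = np.searchsorted(a_grid, policy, side="right") - 1
    idx = np.clip(idx, 0, a_grid.size - 2)
    w = (policy - a_grid[idx]) / (a_grid[idx + 1] - a_grid[idx])  # interpolation weight
    np.add.at(hist_next, idx, (1 - w) * hist)                     # mass to the left node
    np.add.at(hist_next, idx + 1, w * hist)                       # mass to the right node

    print(hist_next.sum())  # mass is preserved: 1.0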

7.
In this paper we estimate a dynamic structural model of employment at the firm level. Our dataset consists of a balanced panel of 2790 Greek manufacturing firms. It highlights three important stylized facts: (a) there are periods in which firms decide not to change their labour input, (b) there are periods of large employment changes (the lumpy nature of labour adjustment) and (c) employment spikes are typically followed by periods of smooth, low employment growth. Following Cooper and Haltiwanger [Cooper, R.W. and Haltiwanger, J. “On the Nature of Capital Adjustment Costs”, Review of Economic Studies, 2006; 73(3); 611–633], we consider a dynamic discrete choice model with a general specification of adjustment costs including convex, non-convex and “disruption of production” components. We use a method-of-simulated-moments procedure to estimate the structural parameters. Our results indicate considerable fixed costs in Greek employment adjustment.
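In generic form (the specific moments in the paper are the stylized facts above, e.g. inaction and spike frequencies), a method-of-simulated-moments estimator picks the structural parameters to make simulated moments match their data counterparts:

$$ \hat{\theta} = \arg\min_{\theta}\ \big[ m_{\text{data}} - \tilde{m}(\theta) \big]' \, W \, \big[ m_{\text{data}} - \tilde{m}(\theta) \big], $$

where \tilde{m}(\theta) averages the moments over panels simulated from the model at \theta and W is a positive-definite weighting matrix.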

8.
This paper compares numerical solutions to the model of Krusell and Smith [1998. Income and wealth heterogeneity in the macroeconomy. Journal of Political Economy 106, 867–896] generated by different algorithms. The algorithms have very similar implications for the correlations between different variables. Larger differences are observed for (i) the unconditional means and standard deviations of individual variables, (ii) the behavior of individual agents during particularly bad times, (iii) the volatility of the per capita capital stock, and (iv) the behavior of the higher-order moments of the cross-sectional distribution. For example, the two algorithms that differ the most from each other generate individual consumption series that have an average (maximum) difference of 1.63% (11.4%).

9.
Household production, full consumption and the costs of children
Recent work criticises both the logic and relevance of the theoretical basis of the approach to estimating the costs of raising children adopted in much of the economics literature. That approach tends to be restricted purely to models in which household members consume market goods out of given household income, so the “costs of children” are perceived essentially as market consumption costs. This ignores the fact that an important, possibly preponderant element of child costs takes the form of parental time, which must be diverted from alternative uses such as market work, other household production activities and leisure in order to care for children. The studies also ignore the question of the differential incidence of child costs on the adult members of the household. In this paper, we first argue that a satisfactory theoretical approach to modelling child costs must simultaneously incorporate an “individualistic” formulation of the household and a formal treatment of household production. We then provide such a model. Using data from a time use survey, we estimate specialised versions of the model for families with two children and use the results to derive the intra-family distribution of resources and the implied child-rearing costs.

10.
Do demand curves for stocks slope down? Evidence from aggregate data
We examine whether the aggregate demand curve for stocks is downward sloping. As a proxy for aggregate demand, we use net outflows (dividends plus repurchases less net issues) from the stock market scaled by the previous year's market capitalization. To disentangle the information and price pressure effects from the demand curve effects, we use an information-free demographic variable as an instrument and look at the relation between annual changes in aggregate demand and excess market return. We find that information-free changes in the annual aggregate demand for stocks do not lead to changes in the annual excess market return. This finding supports long-term horizontal demand curves for stocks.
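The demand proxy described above can be written directly (the notation is ours, not the paper's):

$$ \text{NetOutflow}_t = \frac{D_t + R_t - NI_t}{MC_{t-1}}, $$

where D_t is aggregate dividends, R_t repurchases, NI_t net issues, and MC_{t-1} the previous year's market capitalization; identification then comes from instrumenting this quantity with the information-free demographic variable.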

11.
Only 10% of the results of consultations in primary care can be assigned to a confirmed diagnosis, while 50% remain “symptoms” and 40% are classified as “named syndromes” (“picture of a disease”). Moreover, fewer than 20% of the most frequent diagnoses account for more than 80% of the results of consultations. This finding, confirmed empirically over the last fifty years, suggests a power law distribution, with critical consequences for diagnosis and decision making in primary care.

Our results show that primary care has a severe “black swan” element in the vast majority of consultations. Some critical cases involving “avoidable life-threatening dangerous developments” (ALDD) such as myocardial disturbance, brain bleeding, and appendicitis may be masked by the often vague symptoms of health disorders ranked among the 20% most frequent diagnoses. The Braun distribution predicts the frequency of health disorders on a phenomenological level and reveals the “black swan” problem, but it is not by itself a tool for arriving at accurate diagnoses. To improve predictions and enhance the reliability of diagnoses, we propose standards of documentation and a systematic manner by which the risk facing a patient with an uncertain diagnosis can be evaluated (diagnostic protocols).

Accepting a power law distribution in primary care implies the following: (1) primary care should no longer be defined only by its “low prevalence” properties, but also by its black-swan incidence problem; this includes rethinking malpractice and the requirements of malpractice litigation; (2) at the level of everyday practice, diagnostic protocols are tools to make diagnoses more reliable; (3) at the level of epidemiology, Braun’s system of classification is useful for generating valid information by which predictions of risks can be improved.
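As an illustrative check of what such a distribution implies (the exponent and the number of diagnosis classes below are hypothetical, not taken from the paper), a power law over diagnosis ranks makes log-frequency fall linearly in log-rank, and a small share of diagnoses covers most consultations:

    import numpy as np

    ranks = np.arange(1, 301)
    freq = 1.0 / ranks**1.2                      # hypothetical power-law frequencies
    freq = freq / freq.sum()

    slope, intercept = np.polyfit(np.log(ranks), np.log(freq), 1)
    print(slope)                                 # ~ -1.2: the power-law exponent

    top = int(0.2 * ranks.size)
    print(freq[:top].sum())                      # share covered by the top 20% of diagnoses (~0.86)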

12.
This paper studies the properties of the solution to the heterogeneous agents model in Den Haan et al. [2009. Computational suite of models with heterogeneous agents: incomplete markets and aggregate uncertainty. Journal of Economic Dynamics and Control, this issue]. To solve for the individual policy rules, we use an Euler-equation method iterating on a grid of pre-specified points. To compute the aggregate law of motion, we use the stochastic-simulation approach of Krusell and Smith [1998. Income and wealth heterogeneity in the macroeconomy. Journal of Political Economy 106, 867–896]. We also compare the stochastic- and non-stochastic-simulation versions of the Krusell–Smith algorithm, and we find that the two versions are similar in terms of their speed and accuracy.
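The core of the Krusell–Smith aggregate-law-of-motion step can be sketched as follows (the simulated series and coefficients here are synthetic stand-ins): regress log K' on log K separately for each aggregate state on data generated by the model, then feed the fitted law back in until the coefficients converge:

    import numpy as np

    rng = np.random.default_rng(0)
    T = 5_000
    z = rng.integers(0, 2, size=T)                  # aggregate state: 0 = bad, 1 = good
    logK = np.empty(T)
    logK[0] = np.log(10.0)
    a_true, b_true = np.array([0.08, 0.12]), 0.96
    for t in range(T - 1):                          # stand-in simulated aggregates
        logK[t + 1] = a_true[z[t]] + b_true * logK[t] + 0.001 * rng.standard_normal()

    for state in (0, 1):
        mask = z[:-1] == state
        X = np.column_stack([np.ones(mask.sum()), logK[:-1][mask]])
        coef, *_ = np.linalg.lstsq(X, logK[1:][mask], rcond=None)
        print(state, coef)                          # recovers (a_z, b_z)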

13.
This paper deals with the issue of arbitrage under differential information and incomplete financial markets, with a focus on the information that no-arbitrage asset prices can reveal. Time and uncertainty are represented by two periods and a finite set S of states of nature, one of which will prevail at the second period. Agents may carry out limited financial transfers across periods and states via finitely many nominal assets. Each agent i has private information about which state will prevail at the second period; this information is represented by a subset S_i of S. Agents receive no wrong information in the sense that the “true state” belongs to the “pooled information” set ∩_i S_i, which is hence assumed to be non-empty.

Our analysis is twofold. We first extend the classical symmetric-information analysis to the asymmetric setting, via a concept of no-arbitrage price. Second, we study how such no-arbitrage prices convey information to agents in a decentralized way. The main difference between the symmetric and the asymmetric settings stems from the fact that a classical no-arbitrage asset price (common to every agent) always exists in the first case, but no longer in the asymmetric one, thus allowing arbitrage opportunities. This is the main reason why agents may need to refine their information up to an information structure which precludes arbitrage.
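For orientation, the classical symmetric benchmark alluded to above is the standard state-price characterization (with V the S-by-J matrix of nominal asset payoffs): a price q admits no arbitrage if and only if

$$ q_j = \sum_{s \in S} \lambda_s V_{sj}, \qquad j = 1,\dots,J, $$

for some state-price vector \lambda with \lambda_s > 0 for every s in S. Under asymmetric information, agent i would only require positive state prices on the states in S_i (a sketch of the asymmetric analogue, not the paper's exact definition), and a single price satisfying every agent's restriction simultaneously need not exist.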

14.
Despite its ability to produce optimal solutions, the Linear Decision Rule (LDR) has not had a significant impact in the business environment. The Production Switching Heuristic (PSH), which has shown promising results when compared with the LDR, has seen some business application because of its practicality and flexibility. During aggregate production planning, forecast errors are almost unavoidable, but the sensitivity of these models to such errors has not been thoroughly tested, and insufficient attention has been paid to the cost effects of forecast errors and other important interactions. This study investigates these issues by analyzing the results of 740 simulated problems.

Using the famous “paint factory” cost data, the sensitivity of the LDR and the PSH is examined under various experimental conditions. The factors controlled at different levels are: forecast error mean, forecast error standard deviation, demand pattern, demand variability, and cost coefficients. The results show that (1) the PSH is generally less sensitive than the LDR to forecast errors, (2) both the forecast error mean and standard deviation effectively measure the severity of forecast errors, and (3) underforecasts cause a smaller cost penalty than overforecasts.

These outcomes have helpful managerial implications for decision making in aggregate planning. They suggest that the use of the PSH could yield cost savings even when significant forecast errors are expected, as long as period-to-period demand variability is not substantially high. Also, bias warrants more attention than MSE in evaluating the extent of forecast errors and their eventual cost impact on aggregate production planning.
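To convey the flavor of a production switching rule (the levels and dead-band thresholds below are entirely hypothetical; the actual PSH specification follows the cited literature), production is held at one of a few discrete levels and switched only when the inventory position leaves a dead band, which is what makes the heuristic comparatively insensitive to forecast errors:

    # Stylized sketch of a production-switching rule with three levels.
    LEVELS = {"low": 300, "normal": 450, "high": 600}   # units per period

    def switch(production, inventory, forecast, lower=80, upper=220):
        """Keep the current level inside the dead band; switch outside it."""
        position = inventory - forecast          # inventory net of expected demand
        if position < lower:
            return LEVELS["high"]                # running short: switch up
        if position > upper:
            return LEVELS["low"]                 # overstocked: switch down
        return production                        # otherwise leave production alone

    level = LEVELS["normal"]
    for inv, fc in [(300, 250), (120, 260), (500, 240)]:
        level = switch(level, inv, fc)
        print(level)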

15.
In this paper, some new indices for ordinal data are introduced. These indices are designed to measure the degree of concentration on the “small” or the “large” values of a variable whose level of measurement is ordinal. Their advantage relative to other approaches is that they ascribe unequal weights to each class of values. Although they constitute a useful tool in various fields of application, the focus here is on their use in sample surveys, specifically in situations where one wants to take into account the “distance” of the responses from the “neutral” category in a given question. The properties of these indices are examined, and methods for constructing confidence intervals for their actual values are discussed. The performance of these methods is evaluated through an extensive simulation study.

16.
The cause of the “housing bubble” associated with the sharp rise and then drop in home prices over the period 1998–2008 has been the focus of significant policy and research attention. The dramatic increase in subprime lending during this period has been broadly blamed for these market dynamics. In this paper we empirically investigate the validity of this hypothesis against several alternative explanations. A model of house price dynamics is specified and estimated using a cross-sectional time-series database covering 20 metropolitan areas over the period 1998–2006. Results suggest that prior to early 2004, economic fundamentals provide the primary explanation for house price dynamics. Subprime credit activity does not seem to have had much impact on subsequent house price returns at any time during the observation period, although there is strong evidence of a price-boosting effect by investor loans. However, we do find strong evidence that a credit regime shift took place in late 2003, as the GSEs were displaced in the market by private issuers of new mortgage products. Market fundamentals became insignificant in affecting house price returns, and the price-momentum conditions characteristic of a “bubble” were created. Thus, rather than causing the run-up in house prices, the subprime market may well have been a joint product, along with house price increases (i.e., the “tail”), of the changing institutional, political, and regulatory environment characteristic of the period after late 2003 (the “dog”).

17.
This paper examines the lag distribution relating wholesale to consumer price changes. Sims' causality test indicates a one-sided lag structure. Following Hatanaka and Wallace, the emphasis is on those parameters of the lag distribution that can be estimated with relatively high precision: the sum of coefficients and the first four moments of the distribution. Short-run effects are estimated from the lag moments using Pearson's method for equating moments. As a smoothness prior, a Beta distribution in the lag weights is suggested. Tests for bias due to missing components in the wholesale price index indicate that little is lost because of misspecification.
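In one conventional notation (ours, for illustration), with w_j the weight on wholesale price changes lagged j periods, the precisely estimable quantities are the long-run sum and the normalized lag moments:

$$ S = \sum_{j} w_j, \qquad \mu_k = \frac{\sum_j j^k\, w_j}{\sum_j w_j}, \quad k = 1,\dots,4, $$

from which a member of the Pearson family (here a Beta-shaped lag curve) is chosen to match \mu_1, ..., \mu_4, yielding the short-run weights.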

18.
Megan's Law requires public dissemination of information from sex offender registries. Opponents of this controversial law have questioned whether households misinterpret or even use this information. One concern was that the information might simply induce a “fear of crime.” This study finds evidence of both use and misinterpretation of the publicly available information on sex offenders. Using a unique dataset that tracks sex offenders in Hillsborough County, Florida, the results indicate that after a sex offender moves into a neighborhood, nearby housing prices fall by 2.3% ($3500 on average). However, once a sex offender moves out of a neighborhood, housing prices appear to rebound immediately. Surprisingly, these price impacts do not appear to differ in areas near high-risk offenders labeled as “predators.”

19.
Management is one of the few professions, the authors note, in which members have no formal “rehearsal space” for honing their skills. In response to this need, organizations such as The Center for Creative Leadership, MIT's Learning Center, and The Stern School of Business at New York University have created a brave new world of management simulations—“practice fields” for the learning organization. Some of these new games (“simuworlds”) use computer programs to replicate an entire industry and give participants an opportunity to play out one company's strategy in that setting. Other simulations (“microworlds”) engage participants in complex behavioral role playing, based on scenarios that typically develop within a company. Still other simulations combine both approaches. The authors take their point of departure from Peter Senge's definition of the “learning barriers” that develop in any organization: solving fragmented “problems” rather than dealing with systemic issues; overemphasis on competition at the expense of cooperation; and a failure to innovate until forced to do so. The new simulations, the authors argue, are particularly useful in helping managers learn how to overcome these barriers.

20.
Health insurance in the United States is typically acquired through an employer-sponsored program. Often, employees offered employer-provided health insurance have the option to extend coverage to their spouse and dependents. We investigate the implications of the “publicness” of health insurance coverage for the labor market careers of spouses. The theoretical innovation of the paper is to extend the standard partial–partial equilibrium labor market search model to a multiple-searcher setting with multi-attribute job offers, some of the attributes being treated as public goods within the household. The model is estimated on data from the Survey of Income and Program Participation (SIPP) using a Method of Simulated Moments (MSM) estimator. We demonstrate how previous estimates of the marginal willingness to pay (MWP) for health insurance based on cross-sectional linear regression estimators may be seriously biased due to the presence of dynamic selection effects and misspecification of the decision-making unit.
