Similar Literature (20 results)
1.
It is plain that the Austrian revival that began in the 1970s has yet to succeed in convincing the mainstream of the academy to jettison their physics-based mathematical models in favor of the sort of models and forms of argumentation that contemporary Austrians advocate. Agent-based computational modeling is still in its relative infancy but is beginning to gain recognition among economists disenchanted with the neoclassical paradigm. The purpose of this paper is to assuage concerns that readers might have regarding methodological consistency between agent-based modeling and Austrian economics and to advocate its adoption as a means to convey Austrian ideas to a wider audience. I examine models developed and published by other researchers and ultimately provide an outline of how one might develop a research agenda that leverages this technique. I argue that agent-based modeling can be used to enhance Austrian theorizing and offers a viable alternative to the neoclassical paradigm.
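The agent-based approach this abstract advocates can be illustrated with a minimal sketch. The toy model below is a generic bilateral-exchange simulation, not one of the models examined in the paper; the `simulate_market` function and all its parameters are illustrative assumptions.

```python
import random

# Minimal agent-based exchange sketch: buyers and sellers with
# heterogeneous reservation values meet in random pairwise matches.
# A trade occurs only when a buyer's willingness to pay exceeds a
# seller's cost; the price splits the surplus. No central auctioneer
# is assumed -- outcomes emerge from decentralized interaction.

def simulate_market(n_agents=100, n_rounds=50, seed=0):
    rng = random.Random(seed)
    buyers = [rng.uniform(0, 1) for _ in range(n_agents)]   # willingness to pay
    sellers = [rng.uniform(0, 1) for _ in range(n_agents)]  # unit cost
    prices = []
    for _ in range(n_rounds):
        rng.shuffle(buyers)
        rng.shuffle(sellers)
        for wtp, cost in zip(buyers, sellers):
            if wtp > cost:                     # mutually beneficial trade
                prices.append((wtp + cost) / 2)
    return prices

prices = simulate_market()
avg = sum(prices) / len(prices)
```

Because every realized price lies between a seller's cost and a buyer's willingness to pay, the average transaction price stays inside the unit interval even though no agent observes the market as a whole.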

2.
Robust political economy emphasizes the lack of benevolence and omniscience of would-be reformers. In addition, we consider the effects of biased decision-making on the robustness of the policy implications. This paper examines the robustness of the policy implications of models based on coordination failures and poverty traps. In particular, we address the revival in ‘big push’ type models and its policy implications. We argue that attempts to promote economic development through ‘big push’ models lack robustness. JEL Codes: O1, O20, P26, P41

3.
Although equilibrium allocations in models with incomplete markets are generally not Pareto-efficient, it is often argued that quantitative welfare losses from missing assets are small when time horizons are long and shocks are transitory. In this paper we use a computational analysis to show that even in the simplest infinite horizon model without aggregate uncertainty welfare losses can be substantial. Furthermore we show that in this model welfare losses from incomplete markets do not necessarily disappear when one considers calibrations of the model in which agents become very patient. We argue that when the economic model is calibrated to higher frequency data, the period persistence of negative income shocks must increase as well. In this case the welfare loss of incomplete markets remains constant even as agents' rate of time preference tends to one. Journal of Economic Literature Classification Numbers: D52, D58, D60.

4.
ABSTRACT: In this paper we argue that national accounting categories provide an inadequate basis for evaluating differences between public and private sector services. This is because accounting categories rely on economic concepts such as market price but do not take account of substantive public policy goals such as universality. The argument has important consequences for the structures and systems of delivery especially where nonprofit providers and social enterprise models are substituted for public bodies formerly integrated into the government's delivery system. Using an example taken from the UK's National Health Service, we show that the mechanisms for ensuring universality through redistribution are not sufficiently taken into account for classification purposes.

5.
Structural Vector Autoregressions with a differenced specification of hours (DSVAR) suggest that productivity shocks identified using long-run restrictions lead to a persistent and significant decline in hours worked. Economists have interpreted this evidence as showing that standard business cycle models in which a positive technology shock leads to a rise in hours are inconsistent with the data. In this paper we argue that such a conclusion is unwarranted because the model's data and actual data are not treated symmetrically. To illustrate this problem, we estimate and test a flexible-price DSGE model with non-stationary hours using Indirect Inference on impulse responses of hours and output after technology and non-technology shocks. We find that, once augmented with a moderate amount of real frictions, the model can closely mimic the impulse responses obtained from a DSVAR on actual data. Using this model as a data generating process, we show that our estimation method is less subject to bias than a method that would directly compare theoretical responses with responses from the DSVAR.

6.
Innovation diffusion processes are generally described at aggregate level with models like the Bass Model (BM) and the Generalized Bass Model (GBM). However, the recognized importance of communication channels between agents has recently suggested the use of agent-based models, like Cellular Automata. We argue that an adoption or purchase process is nested in a communication network that evolves dynamically and indirectly generates a latent non-constant market potential affecting the adoption phase. Using Cellular Automata we propose a two-stage model of an innovation diffusion process. First we describe a communication network, an Automata Network, necessary for the “awareness” of an innovation. Then, we model a nested process depicting the proper purchase dynamics. Through a mean field approximation we propose a continuous representation of the discrete time equations derived from our nested two-stage model. This constitutes a special non-autonomous Riccati equation, not yet described in well-known international catalogues. The main results refer to the closed form solution that includes a general dynamic market potential and to the corresponding statistical analysis for identification and inference. We discuss an application to the diffusion of a new pharmaceutical drug.
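The nested two-stage structure (awareness first, then purchase) can be sketched in simplified discrete time. This is an illustrative toy, not the authors' Cellular Automata specification or their Riccati-equation solution; the logistic awareness stage, the Bass-type hazard, and every parameter value are assumptions chosen only to show the mechanism of a latent, non-constant market potential.

```python
# Two-stage diffusion sketch: awareness spreads first (stage 1),
# producing a time-varying latent market potential m_t; purchases then
# follow a Bass-type hazard p + q * (adopters / m_t) applied to the
# aware-but-not-yet-adopted pool (stage 2). All parameters illustrative.

def two_stage_diffusion(N=10_000, T=60, a=0.08, p=0.02, q=0.35):
    aware, adopters = 1.0, 0.0
    path = []
    for _ in range(T):
        # Stage 1: awareness grows logistically (word-of-mouth proxy)
        aware += a * aware * (1 - aware / N)
        # Stage 2: Bass-type hazard on the aware pool
        m_t = aware                      # latent, non-constant market potential
        hazard = p + q * (adopters / m_t)
        adopters += hazard * max(m_t - adopters, 0.0)
        path.append(adopters)
    return path

path = two_stage_diffusion()
```

Unlike the plain Bass model, where the market potential is a fixed constant, here cumulative adoption is capped at each date by how far awareness has spread, so the ceiling itself moves over time.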

7.
Australia's Privacy Act 1988 is under review with a view to bringing Australia's privacy laws into the digital era, more in line with the European Union's General Data Protection Regulation (GDPR). This article discusses how the GDPR can be refined and standardised to be more effective in protecting privacy in the digital era while not adversely affecting the digital economy that relies heavily on data. We argue that an ideal data policy should be informative and transparent about potential privacy costs while giving consumers a menu of opt-in choices into which they can self-select.

8.
We propose a simple method to identify the effects of unilateral and non‐discriminatory trade policies on bilateral trade within a theoretically consistent empirical gravity model. Specifically, we argue that structural gravity estimations should be performed with data that include not only international trade flows but also intra‐national trade flows. The use of intra‐national sales allows identification of the effects of non‐discriminatory trade policies such as most favoured nation tariffs, even in the presence of exporter and importer fixed effects. A byproduct of our approach is that it can be used to recover estimates of the trade elasticity, a key parameter for quantitative trade models. We demonstrate the effectiveness of our techniques in the case of most favoured nation tariffs and “time to export” as representative non‐discriminatory determinants of trade on the importer and on the exporter side, respectively. Our methods can be extended to quantify the impact on trade of any country‐specific characteristics as well as any non‐trade policies.

9.
We show that firms can employ data‐driven methods to improve their hiring decisions. Specifically, we use data available to National Football League (NFL) teams prior to the NFL draft to estimate econometric models that predict the future performance of drafted quarterbacks. As our methods are replicable, stakeholders can use them to improve the draft's efficiency and help it accomplish its mission to promote competitive balance. Furthermore, data‐driven methods such as ours can help firms avoid biases against employee characteristics that do not affect future job performance. (JEL L83)

10.
Research into predictive accuracy testing remains at the forefront of the forecasting field. One reason for this is that rankings of predictive accuracy across alternative models, which under misspecification are loss function dependent, are universally utilized to assess the usefulness of econometric models. A second reason, which corresponds to the objective of this paper, is that researchers are currently focusing considerable attention on so‐called big data and on new (and old) tools that are available for the analysis of this data. One of the objectives in this field is the assessment of whether big data leads to improvement in forecast accuracy. In this survey paper, we discuss some of the latest (and most interesting) methods currently available for analyzing and utilizing big data when the objective is improved prediction. Our discussion includes a summary of various so‐called dimension reduction, shrinkage and machine learning methods as well as a summary of recent tools that are useful for ranking prediction models associated with the implementation of these methods. We also provide a brief empirical illustration of big data in action, in which we show that big data are indeed useful when predicting the term structure of interest rates.
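The shrinkage idea the survey summarizes can be illustrated with a small self-contained sketch: a forecast with many noisy predictors fit by ridge regression and compared with unpenalized least squares. The synthetic data-generating process, the penalty value, and the helper functions `solve` and `ridge_fit` are all hypothetical; this is not a method or dataset from the paper.

```python
import random

# Ridge shrinkage sketch: fit y on k predictors of which only the
# first carries signal, via the normal equations (X'X + lam*I) b = X'y.
# Pure-stdlib linear solve; data are synthetic and illustrative.

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for j in range(c, n + 1):
                M[r][j] -= f * M[c][j]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][j] * x[j] for j in range(r + 1, n))) / M[r][r]
    return x

def ridge_fit(X, y, lam):
    k = len(X[0])
    XtX = [[sum(x[i] * x[j] for x in X) + (lam if i == j else 0.0)
            for j in range(k)] for i in range(k)]
    Xty = [sum(x[i] * yi for x, yi in zip(X, y)) for i in range(k)]
    return solve(XtX, Xty)

rng = random.Random(1)
k, n = 8, 30
make_x = lambda: [rng.gauss(0, 1) for _ in range(k)]
f = lambda x: 2.0 * x[0] + rng.gauss(0, 1)   # only predictor 0 is signal
X_tr = [make_x() for _ in range(n)]; y_tr = [f(x) for x in X_tr]
X_te = [make_x() for _ in range(n)]; y_te = [f(x) for x in X_te]

def mse(beta):
    return sum((yi - sum(b * xi for b, xi in zip(beta, x))) ** 2
               for x, yi in zip(X_te, y_te)) / len(y_te)

beta_ols = ridge_fit(X_tr, y_tr, 0.0)    # lam = 0 recovers OLS
beta_ridge = ridge_fit(X_tr, y_tr, 5.0)
err_ols, err_ridge = mse(beta_ols), mse(beta_ridge)
```

The penalty never enlarges the coefficient vector: the squared norm of the ridge solution is weakly smaller than that of OLS, which is the sense in which the estimator "shrinks" and why it can reduce out-of-sample error when most predictors are noise.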

11.
ABSTRACT

I consider recent strategies proposed by econometricians for extrapolating causal effects from experimental to target populations. I argue that these strategies fall prey to the extrapolator’s circle: they require so much knowledge about the target population that the causal effects to be extrapolated can be identified from information about the target alone. I then consider comparative process tracing (CPT) as a potential remedy. Although specifically designed to evade the extrapolator’s circle, I argue that CPT is unlikely to facilitate extrapolation in typical econometrics and evidence-based policy applications. To argue this, I offer a distinction between two kinds of extrapolation, attributive and predictive, the latter being prevalent in econometrics and evidence-based policy. I argue that CPT is not helpful for predictive extrapolation when using the kinds of evidence that econometricians and evidence-based policy researchers prefer. I suggest that econometricians may need to consider qualitative evidence to overcome this problem.

12.
The Lucas Paradox observes that capital flows predominantly to relatively rich countries, contradicting the neoclassical prediction that it should flow to poorer capital-scarce countries. In an influential study, Alfaro, Kalemli-Ozcan, and Volosovych (AKV) argue that cross-country variation in institutional quality can fully explain the Paradox, contending that if institutional quality is included in regression models explaining international capital inflows, a country’s level of economic development is no longer statistically significant. We replicate AKV’s results using their cross-sectional IFS capital flow data. Motivated by the importance of conducting inference in statistically adequate models, we focus on misspecification testing of alternative functional forms of their empirical model of capital flows. We show that their resolution of the Paradox relies on inference in a misspecified model. In models that do not fail basic misspecification tests, even though institutional quality is a significant determinant of capital inflows, a country’s level of economic development also remains a significant predictor. The same conclusions are reached using an extended dataset covering more recent IFS international capital flow data, first-differenced capital stock data and additional controls.

13.
Abstract:

In this paper, we apply Celso Furtado’s vision of the process of economic development to the United States’ economy. Furtado was a creator of Latin American structuralism and continues to be one of the region’s most influential economists. Yet, he is little known in the English literature. As we argue, there are few academics who offer a theoretical framework capable of robustly evaluating the current trajectory of U.S. economic development with the depth of Furtado. Through his analytical lens, and with some help from John Maynard Keynes, we examine the present reality, as well as the more remote economic history of the US. We argue that, seen through Furtado’s lens, the US can now be accurately described as an under-developing economy.

14.
We examine whether the Fama and French (1992) (F&F) model can be adapted to become a more versatile and flexible tool, capable of incorporating variations of company characteristics in a more dynamic form. For this, the risk factors are reconstructed with each new monthly data observation. We argue that, over time, the evaluation of a company may change as a result of variations in its market price, size or book price, and we are aware that the F&F model does not accurately reflect these dynamics. Our results show that the adapted model is able to capture the behaviour of a greater number of stocks than the original F&F model, and that the risk factors are more significant when built through our procedure. In addition, we carry out these adaptations during a period of instability in financial markets.

15.
In this comment, we answer the question posed in Svensson’s (2000) paper ‘Does the P* Model Provide any Rationale for Monetary Targeting?’ – in contrast to him – in the affirmative. We argue that a strategy of monetary targeting can be rationalized within the P* framework. Furthermore, we demonstrate that money growth targeting is a special form of inflation forecast targeting based on a ‘limited’ information set. In contrast to ‘full information’ inflation forecast targeting, monetary growth targeting is likely to be more robust under changing conditions of the real world.

16.
The paper presents a brief overview of the basic premise of Burczak's Socialism after Hayek, and shows that Burczak's “applied epistemological postmodernism” presents a unique unifying ground for heterodox economics, breaking down traditional barriers between right and left. This new approach allows us to revisit the Marx-Keynes-Hayek debates in a more constructive way for a unified theory of social justice. However, we argue that Burczak's system does not automatically guarantee full employment, so it cannot be considered an ideal theory of social justice. A Post Keynesian contribution is presented in the form of the Employer of Last Resort (ELR) program, which we argue is compatible with and complementary to Burczak's theory of social justice. Finally, we argue that an adequate system design of the magnitude proposed here must be informed by the principles of institutional adjustment as outlined by J. Fagg Foster.

17.
In this article we present a simple real business cycle (RBC) model, in order to show that these models capture many of the features of business cycles in the real economy. While these models are very abstract, we argue that they are a useful way of thinking about the macro-economy. RBC models have also been influential in refocusing attention on supply issues in macroeconomics, after a long postwar focus on aggregate demand management in Australia and most other western economies. Policies such as structural reform and labour market reform are clearly aimed at influencing the supply side of the economy and productivity, and can be understood within the framework of RBC theory. RBC models have developed rapidly in recent years, yet there remains a good deal of misunderstanding about the methods and aims of these models. In this article we present a review of the literature and examine a simple model, using graphical techniques, to clarify some issues. We also argue that these models, while having limitations, have caused a fruitful re-examination of supply issues in economics, after the almost exclusive focus on aggregate demand in macroeconomics until the late 1970s.

18.
We examine whether the Phelps–Koopmans theorem is valid in models with nonconvex production technologies. We argue that a nonstationary path that converges to a capital stock above the smallest golden rule may indeed be efficient. This finding has the important implication that “capital overaccumulation” need not always imply inefficiency. Under mild regularity and smoothness assumptions, we provide an almost-complete characterization of situations in which every path with limit in excess of the smallest golden rule must be inefficient, so that a version of the Phelps–Koopmans theorem can be recovered. Finally, we establish that a nonconvergent path with limiting capital stocks above (and bounded away from) the smallest golden rule can be efficient, even if the model admits a unique golden rule. Thus the Phelps–Koopmans theorem in its general form fails to be valid, and we argue that this failure is robust across nonconvex models of growth.

19.
Factor-analytic models can substantially improve the measurement of comparative legal systems and thereby our understanding of how legal systems influence economic outcomes. These methods yield better estimates of latent constructs, allow us to evaluate whether institutional features are representative of a theoretical construct and whether allegedly distinct theoretical constructs can be separated empirically. We illustrate these points through a re-analysis of a 2003 study by Djankov, La Porta, Lopez-de-Silanes and Shleifer, using a factor-analytic method that combines continuous and categorical indicators. Our results strengthen these authors' findings with respect to how legal formalism relates to legal origin and the quality of the legal system. Yet, the results also show that many of the original index items are not significantly positively related to formalism. The results thus shed light on what institutional features should be prioritized for reform if we seek to make legal systems less formalistic. Moreover, we question the evidence that the formalism model better predicts the quality of the legal system than does the alternative “incentives” model. We argue, instead, that formalism and incentives may both relate to the tendency of a legal system to use bureaucratic rule-making. Our approach can readily be applied to the analysis of legal concepts other than formalism. Journal of Comparative Economics 35 (4) (2007) 711–728.

20.
Summary A semiorder can be thought of as a binary relation P for which there is a utility u representing it in the following sense: x P y iff u(x) – u(y) > 1. We argue that weak orders (for which indifference is transitive) cannot be considered a successful approximation of semiorders; for instance, a utility function representing a semiorder in the manner mentioned above is almost unique, i.e. cardinal and not only ordinal. In this paper we deal with semiorders on a product space and their relation to given semiorders on the original spaces. Following the intuition of Rubinstein we find surprising results: with the appropriate framework, it turns out that a Savage-type expected utility requires significantly weaker axioms than it does in the context of weak orders. We wish to thank Tatsuro Ichiishi, Jorge Nieto, Ariel Rubinstein, Efraim Sadka and especially David Schmeidler and anonymous referees for stimulating discussions and comments. I. Gilboa received partial financial support from NSF grants nos. IRI-8814672 and SES-9113108, as well as from the Alfred P. Sloan Foundation.
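The definition "x P y iff u(x) − u(y) > 1" can be checked directly: under the unit threshold, indifference need not be transitive, which is exactly why weak orders fail as an approximation. A minimal sketch (the three utility values are arbitrary choices for illustration):

```python
# Semiorder via a unit-threshold utility: x P y iff u(x) - u(y) > 1.
# With gaps below the threshold, agents are indifferent pairwise, yet
# indifference chains can accumulate into strict preference.

def P(ux, uy):
    return ux - uy > 1          # strict preference under the semiorder

u = {"x": 0.0, "y": 0.6, "z": 1.2}

# x ~ y and y ~ z: each utility gap (0.6) is below the unit threshold...
indiff_xy = not P(u["x"], u["y"]) and not P(u["y"], u["x"])
indiff_yz = not P(u["y"], u["z"]) and not P(u["z"], u["y"])
# ...yet z P x, since u(z) - u(x) = 1.2 > 1: indifference is intransitive.
z_over_x = P(u["z"], u["x"])
```

Any weak order would force x ~ z from x ~ y and y ~ z, so no weak order reproduces this pattern.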

