Similar Documents
20 similar documents found (search time: 15 ms)
1.
Sanjaya Acharya, Economic Systems, 2010, 34(4): 413–436
This paper measures the potential impacts of a devaluation of the domestic currency in Nepal, a small, developing, landlocked and transitional South Asian economy that remains under-studied in the policy literature. The impacts on growth, distribution, price changes in factor and product markets, and selected macroeconomic aggregates are measured. Using a computable general equilibrium model applied to social accounting matrix data, we conclude that devaluation is expansionary but mostly benefits the rich, thus leading to a more uneven income distribution. In general, economic activity expands in the agricultural and industrial sectors, whereas services contract. However, when the rate of devaluation is high, the agricultural sector also starts contracting. For this typical developing economy, devaluation improves the saving/investment and export/import ratios, whereas the budget deficit widens.

2.
Predicting a recovery from a crisis is always difficult, but it is particularly so for the 2008 crisis in the United States. How could a small segment of the financial markets known as subprime credit drag the world's largest economy into the worst recession since WWII? The resulting conflicts in policy responses are so severe that the short-term objective (recovery) clashes with the longer-term and more structural goals (governance, regulation, technology). This, and the enormous uncertainties it has caused, adds to the difficulty of predicting the pace of recovery. While the economic turnaround depends on consumers' decisions to spend and businesses' decisions to invest and hire, in an uncertain situation such decisions can only be taken on the basis of market players' perceptions of opportunity, which depend on their emotional state and confidence. When the latter produce a spontaneous urge to action ('animal spirits'), the recovery process accelerates. Thus, an appropriate model for predicting recovery should be able to incorporate such perception factors. By identifying and prioritizing economic and policy factors, it is shown how such a model, the Analytic Network Process (ANP), can be used to predict the recovery time of the US economy. The forecast was made during Spring 2009 by the author working with participants in a seminar on the 'Economics of Financial Crisis' at Cornell University. We used an expert-judgment approach within the framework of a decision theory model, based on an ANP structure that captures the interplay between the financial market, the housing sector, and market confidence, all of which are influenced by a range of policies. It is estimated that a real, sustainable recovery will begin around late July or early August 2010. While a quicker recovery is possible given the enormous size of the fiscal stimulus, monetary injection and unprecedented measures of qualitative easing, it is our conjecture that the temporary nature of all these measures will make such a quick turnaround unsustainable (a double-dip recession). When sensitivity analysis was performed, it was found that altering the priorities of the policies, and their interactions with the aggregate demand components, would not significantly change the estimated time to recovery. This stability of the prediction is due to the overriding importance of restoring confidence, which makes the other factors less important.

3.
This brief article first investigates key dimensions underlying the progress realized by data envelopment analysis (DEA) methodologies. The resulting perspective is then used to encourage reflection on future paths for the field. Borrowing from the social sciences literature, we distinguish between problematization and gap identification in suggesting strategies to push the DEA research envelope. Emerging evidence of a declining number of influential methodological (theory-based) publications and a flattening diffusion of applications implies an unfolding maturity of the field. Such findings suggest that focusing on known limitations of DEA and/or of its applications, while searching for synergistic partnerships with other methodologies, can create new and fertile grounds for research. Possible future directions might thus include 'DEA in practice', 'opening the black box of production', 'rationalizing inefficiency', and 'the productivity dilemma'. What we are therefore proposing is a strengthening of the methodology's contribution to fields of endeavor both including, and beyond, those considered in the past.

4.
Managing risk has been widely acknowledged as a crucial managerial task in the development of new technology. More recently, the acceptance of new technologies has increasingly been influenced by secondary stakeholders, some of whom are difficult to identify, or whose concerns are not easily reconciled. This paper develops a conceptual framework, based on the management of technology and R&D literature, stakeholder theory, risk and social judgment, to describe how traditional approaches based on reducing uncertainties through estimating probabilities may not work for social uncertainties; different heuristics are needed to understand and resolve such heterogeneous stakeholder perspectives. We contribute to the discourse by describing how risk perceptions vary among stakeholders, and how this may change over time. The framework suggests that the perception of primary stakeholders towards a specific innovation is 'Standard' when information is well known, but becomes riskier when information is unclear. For secondary stakeholders, when there is a low degree of imperfect information, the stakeholder relationship is an 'Irritant', but it becomes increasingly 'Dangerous' as information becomes ambiguous. We conclude with implications for management and future research.

5.
Trade liberalisation and endogenous growth: Some evidence for Turkey (total citations: 1; self-citations: 0; citations by others: 1)
This paper examines the impact of trade liberalisation on long-run economic development, as measured by real GDP per capita, in Turkey. Based on endogenous growth theory, we employ bivariate and multivariate cointegration analyses to test for a long-run relationship among the relevant variables. The results for Turkey suggest a stable, joint long-run relationship among real GDP per capita, an index of trade liberalisation, and human and physical capital, in accordance with endogenous growth theory. Statistically significant error-correction terms provide further evidence that these variables are indeed cointegrated, which also implies the existence of causal effects.
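A minimal sketch of the kind of multivariate cointegration test described above, using the Johansen procedure from statsmodels. The series are simulated stand-ins for real GDP per capita, a trade-liberalisation index, and human and physical capital; the variable names, lag length and data are illustrative assumptions, not the authors' dataset or specification.

import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen

rng = np.random.default_rng(0)
T = 200
# Simulated I(1) series sharing a common stochastic trend (illustrative only).
trend = np.cumsum(rng.normal(size=T))
data = pd.DataFrame({
    "log_gdp_pc": trend + rng.normal(scale=0.3, size=T),
    "trade_lib":  0.8 * trend + rng.normal(scale=0.3, size=T),
    "human_cap":  0.5 * trend + rng.normal(scale=0.3, size=T),
    "phys_cap":   1.2 * trend + rng.normal(scale=0.3, size=T),
})

# Johansen trace test: det_order=0 includes a constant, k_ar_diff=1 lagged difference.
res = coint_johansen(data, det_order=0, k_ar_diff=1)
print("Trace statistics:", res.lr1)
print("5% critical values:", res.cvt[:, 1])  # cvt columns are the 90%, 95%, 99% values

A trace statistic above its critical value points to at least one cointegrating relationship among the series, which is the kind of joint long-run relationship the abstract reports.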

6.
Nine macroeconomic variables are forecast in a real-time scenario using a variety of flexible-specification, fixed-specification, linear, and nonlinear econometric models. All models are allowed to evolve through time, and our analysis focuses on model selection and performance. In the context of real-time forecasts, flexible-specification models (including linear autoregressive models with exogenous variables and nonlinear artificial neural networks) appear to offer a useful and viable alternative to less flexible fixed-specification linear models for a subset of the economic variables we examine, particularly at forecast horizons greater than one step ahead. We speculate that one reason for this result is that the economy is evolving (rather slowly) over time. This feature cannot easily be captured by fixed-specification linear models, and it manifests itself in the form of evolving coefficient estimates. We also provide additional evidence supporting the claim that models which 'win' under one model selection criterion (say, a squared-error measure) do not necessarily win under an alternative criterion (say, a confusion-rate measure), highlighting the importance of the particular cost function used by forecasters and 'end-users' to evaluate their models. A wide variety of model selection criteria and statistical tests are used to illustrate our findings.
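A rough illustration of the fixed-versus-flexible comparison described above: a fixed-lag linear autoregression and a small neural network are re-estimated each period and compared on rolling one-step-ahead forecasts of a simulated series with a slowly drifting coefficient. The data, lag length and network size are assumptions chosen for the sketch, not the authors' nine-variable setup.

import numpy as np
from statsmodels.tsa.ar_model import AutoReg
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
T = 220
y = np.zeros(T)
for t in range(1, T):
    phi = 0.5 + 0.3 * np.sin(2 * np.pi * t / T)  # slowly evolving AR(1) coefficient
    y[t] = phi * y[t - 1] + rng.normal(scale=0.5)

window, sq_lin, sq_mlp = 180, [], []
for t in range(window, T):
    train = y[:t]
    # Fixed-specification linear model: AR(2), re-estimated on each data vintage.
    f_lin = AutoReg(train, lags=2).fit().forecast(1)[0]
    # Flexible nonlinear model using the same two lags as inputs.
    X, target = np.column_stack([train[1:-1], train[:-2]]), train[2:]
    mlp = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X, target)
    f_mlp = mlp.predict([[train[-1], train[-2]]])[0]
    sq_lin.append((y[t] - f_lin) ** 2)
    sq_mlp.append((y[t] - f_mlp) ** 2)

print("RMSE, linear AR:   ", np.sqrt(np.mean(sq_lin)))
print("RMSE, neural net:  ", np.sqrt(np.mean(sq_mlp)))

As the abstract stresses, which model 'wins' also depends on the evaluation criterion; replacing squared error with a directional-accuracy (confusion-rate) measure can change the ranking.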

7.
This paper develops a computable general equilibrium (CGE) model of the transition from a centrally planned economy to a market economy. The model is an extension of Wellisz and Findlay's (1986) model of the Soviet second economy. By distinguishing alternative assumptions about the disposition of the government budget, two model variants — the activist and the non-activist — are analyzed. Equilibria of these model variants are computed for various parameter specifications of the Kantorovich ray, which represents the stringency of central planners' direction of the economy. The paper shows that increasing the efficiency of the private sector, while it reduces the size of government subsidies to the state sector, does not necessarily improve the net government budget.

8.
We evaluate the Smets-Wouters New Keynesian model of the US postwar period using indirect inference, the bootstrap, and a VAR representation of the data. We find that the model is strongly rejected. While an alternative (New Classical) version of the model fares no better, adding limited nominal rigidity to it produces a 'weighted' model version that is closest to the data. On data from 1984 onwards - the 'great moderation' - the best model version is one with a high degree of nominal rigidity, close to the New Keynesian case. Our results are robust to a variety of methodological and numerical issues.

9.
Using data from a large U.S. federal job training program, we investigate whether enrolment incentives that exogenously vary the 'shadow prices' for serving different demographic subgroups of clients influence case workers' intake decisions. We show that case workers enrol more clients from subgroups whose shadow prices increase, but select at the margin weaker-performing members of those subgroups. We conclude that enrolment incentives curb cream-skimming across subgroups while leaving a residual potential for cream-skimming within subgroups.

10.
In many econometric models, the asymptotic variance of a parameter estimate depends on the value of another structural parameter in such a way that the data contain little information about the former when the latter is close to a critical value. This paper introduces the zero-information-limit condition (ZILC) to identify such models, in which 'weak identification' leads to spurious inference. We find that standard errors tend to be underestimated in these cases, but the size of the asymptotic t-test may be either too great (the intuitive case emphasized in the 'weak instrument' literature) or too small, as in the two cases illustrated here.

11.
Bayesian approaches to the estimation of DSGE models are becoming increasingly popular. Prior knowledge is normally formalized either directly on the values of the deep parameters (a 'microprior') or indirectly on macroeconomic indicators, e.g. moments of observable variables (a 'macroprior'). We introduce a non-parametric macroprior which is elicited from impulse response functions and assess its performance in shaping posterior estimates. We find that using a macroprior can lead to substantially different posterior estimates. We probe into the details of this result, showing that model misspecification is likely to be responsible for it. In addition, we assess to what extent the use of macropriors is impaired by the need to calibrate some hyperparameters.

12.
When location shifts occur, cointegration-based equilibrium-correction models (EqCMs) face forecasting problems. We consider alleviating such forecast failure by updating, intercept corrections, differencing, and estimating the future progress of an 'internal' break. Updating leads to a loss of cointegration when an EqCM suffers an equilibrium-mean shift, but it helps when collinearities are changed by an 'external' break while the EqCM stays constant. Both mechanistic corrections help compared to retaining a pre-break estimated model, but an estimated model of the break process could outperform them. We apply the approaches to EqCMs for UK M1 and compare them with updating a learning function as the break evolves.

13.
Managerial efficiency is as important in social profit enterprises (SPEs) as it is in more traditional financial-profit organizations. In this regard, both donors and SPE executives use efficiency information in making decisions. Here, we suggest a linked, two-stage Data Envelopment Analysis (DEA) methodology for assessing efficiency in both charitable fundraising and cause delivery, and we empirically investigate results for international aid organizations. The model allows efficiency assessment of both fundraising and the utilization of the generated funds when they are directed to cause-related purposes. In particular, this allows measurement of an organization's managerial efficiency relative to multiple phased goals and to peer organizations. Additionally, the approach provides benchmarks for identifying sources of improved performance in fundraising and in program/cause service delivery. It can also project the effects of changes in inputs on the amount of resources available for the charitable organization's cause. The proposed models allow the examiner to assess performance while, at the same time, identifying instances in which the simple ratio measures commonly used in non-profit assessment are (1) deficient and/or (2) misleading, whether because 'incorrect' variables are used or because inefficiency is 'hidden' in the tax form categories filed by an SPE. Importantly, the suggested two-stage DEA methodology can be useful for any organization with multiple linked goals.
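For readers unfamiliar with DEA, each stage of a linked two-stage model of this kind builds on the standard input-oriented, constant-returns (CCR) envelopment program shown below, with stage-one outputs (e.g. funds raised) feeding the second stage as inputs. The notation is the generic textbook form, not the authors' exact linked formulation.

\[
\begin{aligned}
\theta_o^{*} \;=\; \min_{\theta,\,\lambda}\; & \theta \\
\text{s.t.}\quad & \sum_{j=1}^{n} \lambda_j x_{ij} \;\le\; \theta\, x_{io}, \qquad i = 1,\dots,m \;\;\text{(inputs, e.g. fundraising expenses)}\\
& \sum_{j=1}^{n} \lambda_j y_{rj} \;\ge\; y_{ro}, \qquad r = 1,\dots,s \;\;\text{(outputs, e.g. funds raised or delivered)}\\
& \lambda_j \;\ge\; 0, \qquad j = 1,\dots,n .
\end{aligned}
\]

A score of \(\theta_o^{*} = 1\) marks organization \(o\) as efficient relative to its peers, while \(\theta_o^{*} < 1\) indicates that, on the peer frontier, its observed outputs could be produced with only the fraction \(\theta_o^{*}\) of its inputs.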

14.
This paper examines the usefulness of a more refined business cycle classification for monthly industrial production (IP), beyond the usual distinction between expansions and contractions. Univariate Markov-switching models show that a three-regime model is more appropriate than a model with only two regimes. Interestingly, the third regime captures 'severe recessions', contrasting with the conventional view that the additional third regime represents a 'recovery' phase. This is confirmed by means of Markov-switching vector autoregressive models that allow for phase shifts between the cyclical regimes of IP and the Conference Board's Leading Economic Index (LEI). The timing of the severe-recession regime mostly corresponds with periods of substantial financial market distress and severe credit squeezes, providing empirical evidence for the 'financial accelerator' theory.
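A compact sketch of the kind of univariate three-regime Markov-switching model the abstract refers to, fitted here with statsmodels to a simulated monthly IP growth series. The regime means, variances and segment lengths are invented for illustration and are not the paper's estimates.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
# Simulated IP growth: expansion, mild recession, and severe recession segments.
segments = []
for mean, sd, length in [(0.3, 0.4, 120), (-0.3, 0.5, 24), (-1.5, 1.0, 12)] * 2:
    segments.append(rng.normal(mean, sd, length))
y = np.concatenate(segments)

# Three regimes with switching mean and variance.
mod = sm.tsa.MarkovRegression(y, k_regimes=3, trend="c", switching_variance=True)
res = mod.fit(search_reps=20)  # multiple random starts help avoid poor local optima
print(res.summary())

# Smoothed probability of each regime at each date; the third regime should
# pick out the deep, volatile downturns in this simulated series.
print(res.smoothed_marginal_probabilities[:5])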

15.
This paper proposes a nonlinear panel data model which can endogenously generate both 'weak' and 'strong' cross-sectional dependence. The model's distinguishing characteristic is that a given agent's behaviour is influenced by an aggregation of the views or actions of those around them. The model allows for considerable flexibility in terms of the genesis of this herding or clustering type of behaviour. At an econometric level, the model is shown to nest various extant dynamic panel data models, including panel AR models, spatial models (which accommodate weak dependence only), and panel models in which cross-sectional averages or factors exogenously generate strong, but not weak, cross-sectional dependence. An important implication is that the appropriate model for the aggregate series becomes intrinsically nonlinear, due to the clustering behaviour, and thus requires the disaggregates to be considered simultaneously with the aggregate. We provide the associated asymptotic theory for estimation and inference. This is supplemented with Monte Carlo studies and two empirical applications which indicate the utility of the proposed model as a vehicle for modelling different types of cross-sectional dependence.

16.
The traditional Becker/Arrow model of taste discrimination in pay depicts majority and minority labour as perfectly substitutable, implying that all workers perform precisely the same job assignment and have the same qualifications. The model is thus only appropriate for determining whether ceteris paribus pay differences between, for example, white and non-white workers performing job assignment A are attributable to prejudice ('within-assignment discrimination'). It is inappropriate for determining whether ceteris paribus pay differences between white workers in assignment A and non-white workers in assignment B reflect prejudice ('cross-assignment discrimination'). We extend the traditional model to allow for cross-assignment discrimination and propose an empirical methodology for its estimation. In so doing we address two broad questions: (1) Do predictions about cross-assignment discrimination vary with the form of the production function? (2) How can one estimate such discrimination when there is no common measure of productivity? We address the first question by deriving a measure of cross-assignment discrimination for four different production functions—Generalized Leontief, Quadratic, CES, and Cobb-Douglas. The Generalized Leontief provides the most general results, although closed-form solutions are not possible; closed-form solutions are obtainable from the other three functions, but only under restrictive assumptions. There are two main findings. First, most predictions are generally robust across functional forms. Second, cross-assignment discrimination depends upon productivity and labour supply differences between the two worker groups, labour market structure, and the interaction between relative group productivity and prejudice. We address the second question by outlining, for future exploration, a two-stage regression methodology in which a standardised (i.e. common) measure of productivity is estimated separately for each occupation. This measure is then incorporated as a right-hand-side explanatory variable in a second-stage, all-occupation regression designed to estimate cross-assignment discrimination. We discuss the proposed methodology with reference to a valuable and interesting test case: the market for professional sports players.
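For reference, the Generalized Leontief form mentioned above is usually written (following Diewert) as below, here with two labour inputs, x_W for white and x_N for non-white workers. This is the standard textbook parameterisation and may differ in detail from the one used in the paper.

\[
Q \;=\; \beta_{WW}\, x_W \;+\; \beta_{NN}\, x_N \;+\; 2\,\beta_{WN}\,(x_W x_N)^{1/2},
\qquad \beta_{WN} = \beta_{NW}.
\]

The cross term lets the degree of substitutability between the two groups vary freely, which is consistent with the abstract's remark that the Generalized Leontief is the most general of the four forms but does not admit closed-form solutions.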

17.
We consider naming and categorization practices within the information technology (IT) arena, in particular how certain terminologies are able to colonise wide areas of activity and endure for relatively long periods of time, despite the diversity and incremental evolution of individual technical instances. This raises the question of who decides whether or not a particular vendor technology is part of a product category. Who decides the boundaries around a technology nomenclature? Existing Information Systems scholarship has tended to present terminologies as shaped by wide communities of players, but this does not capture how particular kinds of knowledge institutions have emerged in recent years to police the confines of technological fields. The paper follows the work of one such group of experts, the industry analyst firm Gartner Inc., and discusses their current and past role in the evolution of Customer Relationship Management (CRM) software. We show how they make regular (but not always successful) 'naming interventions' within the IT domain and how they attempt to regulate the boundaries that they and others have created through episodes of 'categorisation work'. These experts attempt to exercise control not only over a terminology but also over the interpretation of that name. Our arguments are informed by ethnographic observations carried out on the eve of the contemporary CRM boom and by interviews conducted more recently as part of an ongoing investigation into industry analysts. The paper bridges a number of disparate bodies of literature from Information Systems, Economic Sociology, the Sociology of Scientific Knowledge, and Science and Technology Studies.

18.
We derive an inter-temporal theory of choice, in the spirit of Kreps and Porteus [Kreps, D.M., Porteus, E.L., 1978. Temporal resolution of uncertainty and dynamic choice theory. Econometrica 46, 185–200], in which decision makers have incomplete preferences. This can be used to model indecisiveness as well as unforeseen contingencies. The key to our approach is a time consistency condition and, therefore, the normative connection between ex-ante and ex-post choice. The time consistency condition enables a representation that is a straightforward extension of recursive utility, with the exception that it features an inter-temporal 'utility for flexibility'.

19.
This paper is concerned with estimating preference functionals for choice under risk from the choice behaviour of individuals. We note that there is heterogeneity in behaviour both between individuals and within individuals. By 'heterogeneity between individuals' we mean that people are different, in terms of both their preference functionals and their parameters for those functionals. By 'heterogeneity within individuals' we mean that behaviour may differ even for the same individual facing the same choice problem. We propose methods for taking all forms of heterogeneity into account, concentrating particularly on using a Mixture Model to capture the heterogeneity of preference functionals.
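A toy sketch of the mixture-model idea in the last sentence: each individual's likelihood is a weighted average of the likelihoods implied by two candidate behavioural types, and the mixing weight is estimated jointly with the type-specific parameters. The Bernoulli choice rule, the simulated data and all parameter values are illustrative assumptions, far simpler than the preference functionals estimated in the paper.

import numpy as np
from scipy.optimize import minimize
from scipy.special import expit  # logistic transform keeps parameters in (0, 1)

rng = np.random.default_rng(3)
n_people, n_tasks = 200, 20
# Simulated choices: 60% of subjects pick the risky option 80% of the time
# ("type 1"); the rest pick it 30% of the time ("type 2").
is_type1 = rng.random(n_people) < 0.6
p_true = np.where(is_type1, 0.8, 0.3)
choices = (rng.random((n_people, n_tasks)) < p_true[:, None]).astype(int)
k = choices.sum(axis=1)  # number of risky choices per person

def neg_loglik(theta):
    pi, p1, p2 = expit(theta)  # mixing weight and the two type-specific choice rates
    lik1 = p1 ** k * (1 - p1) ** (n_tasks - k)
    lik2 = p2 ** k * (1 - p2) ** (n_tasks - k)
    return -np.sum(np.log(pi * lik1 + (1 - pi) * lik2))

fit = minimize(neg_loglik, x0=np.zeros(3), method="Nelder-Mead")
print("Estimated (mixing weight, p1, p2):", np.round(expit(fit.x), 3))

In the paper the two components are full preference functionals (e.g. expected utility versus a non-expected-utility alternative) rather than constant choice rates, but the estimation logic is the same.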

20.
This paper evaluates the impact of service sector trade liberalization on the world economy using a ten-region, eleven-sector CGE model with import-embodied technology transfer from developed to developing countries. Simulation results show that service sector trade liberalization not only directly affects world service production and trade, but also has significant implications for other sectors of the economy. The major channel of this impact is through inter-industry input-output relations and the TFP growth induced by services that developing countries import from developed countries, which may embody new information and advanced technology.

