Similar Documents
20 similar documents found (search time: 31 ms)
1.
This symposium creates and stimulates new dialogue and cross‐disciplinary exchange between planning theorists and geographers in researching the transfer of urban policy and planning models, ideas and techniques. The symposium challenges a restricted historical focus in much of the emerging geographical literature on urban policy mobilities by drawing on a rich tradition within planning history of exploring and documenting the trans‐urban travel of planning ideas and models over the last 150 years. It is argued that this longer‐term perspective is required to highlight important historical continuities and institutional legacies to contemporary urban policy circuits and pathways and to question what is particularly new, distinct and innovative about an intensification in the travel of urban ideas, plans and policies over the past decade — and the accompanying scholarly interest in them. The symposium also uses the emphasis on particular details and specific experiences within planning histories to foreground and develop approaches, particularly from recent geographical scholarship, that investigate the contingent and embodied practices and wider epistemic contexts that enable — or hinder — contemporary policy transfer.

2.
3.
Abstract This paper reviews the marketing, transportation and environmental economics literature on the joint estimation of revealed preference (RP) and stated preference (SP) data. The RP and SP approaches are first described with a focus on the strengths and weaknesses of each. Recognizing these strengths and weaknesses, the potential gains from combining data are described. A classification system for combined data that emphasizes the type of data combination and the econometric models used is proposed. A methodological review of the literature is pursued based on this classification system. Examples from the environmental economics literature are highlighted. A discussion of the advantages and disadvantages of each type of jointly estimated model is then presented. Suggestions for future research, in particular opportunities for application of these models to environmental quality valuation, are presented.
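As a purely illustrative sketch (not taken from the paper), a common way in this literature to combine RP and SP choice data is to assume a single preference vector while allowing the two data sources to have different error scales; a hypothetical random-utility formulation is:

```latex
\begin{aligned}
U^{RP}_{nj} &= \beta^{\top} x^{RP}_{nj} + \varepsilon^{RP}_{nj},
  &\operatorname{Var}\!\big(\varepsilon^{RP}_{nj}\big) &= \sigma^{2},\\
U^{SP}_{nj} &= \beta^{\top} x^{SP}_{nj} + \varepsilon^{SP}_{nj},
  &\operatorname{Var}\!\big(\varepsilon^{SP}_{nj}\big) &= \sigma^{2}/\mu^{2},
\end{aligned}
```

so that in a pooled estimation the SP observations are rescaled by the relative scale parameter $\mu$ and the shared $\beta$ is identified from both data sources jointly.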

4.
This paper proposes a template for modelling complex datasets that integrates traditional statistical modelling approaches with more recent advances in statistics and modelling through an exploratory framework. Our approach builds on the well-known and long-standing idea of 'good practice in statistics' by establishing a comprehensive framework for modelling that focuses on exploration, prediction, interpretation and reliability assessment, a relatively new idea that allows individual assessment of predictions.
The integrated framework we present comprises two stages. The first involves the use of exploratory methods to help visually understand the data and identify a parsimonious set of explanatory variables. The second encompasses a two-step modelling process, where the use of non-parametric methods such as decision trees and generalized additive models is promoted to identify important variables and their modelling relationship with the response before a final predictive model is considered. We focus on fitting the predictive model using parametric, non-parametric and Bayesian approaches.
This paper is motivated by a medical problem where interest focuses on developing a risk stratification system for morbidity of 1,710 cardiac patients given a suite of demographic, clinical and preoperative variables. Although the methods we use are applied specifically to this case study, they can be applied in any field, irrespective of the type of response.
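A minimal sketch of the two-step idea, assuming scikit-learn and statsmodels are available; the data, variable names and thresholds are hypothetical and not taken from the case study:

```python
# Illustrative sketch (not the authors' code): screen variables with a
# decision tree, then fit a parametric predictive model on the survivors.
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeClassifier
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1710                                   # sample size mirrors the case study
X = pd.DataFrame(rng.normal(size=(n, 8)),
                 columns=[f"x{i}" for i in range(8)])
# hypothetical binary outcome: only x0 and x3 matter
p = 1 / (1 + np.exp(-(0.8 * X["x0"] - 1.2 * X["x3"])))
y = rng.binomial(1, p)

# Step 1: non-parametric screening via a shallow decision tree
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
keep = X.columns[tree.feature_importances_ > 0.05]

# Step 2: parametric predictive model on the screened variables
logit = sm.Logit(y, sm.add_constant(X[keep])).fit(disp=0)
print(keep.tolist())
print(logit.summary())
```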

5.
In this article, we merge two strands from the recent econometric literature. The first is factor models based on large sets of macroeconomic variables, which have generally proven useful for forecasting, although there is some disagreement in the literature as to the appropriate estimation method. The second is forecast methods based on mixed‐frequency data sampling (MIDAS). This regression technique can take into account unbalanced datasets that emerge from the publication lags of high‐ and low‐frequency indicators, a problem practitioners have to cope with in real time. In this article, we introduce Factor MIDAS, an approach for nowcasting and forecasting low‐frequency variables like gross domestic product (GDP) that exploits the information in a large set of higher‐frequency indicators. We consider three alternative MIDAS approaches (basic, smoothed and unrestricted) that provide harmonized projection methods and allow a comparison of the alternative factor estimation methods with respect to nowcasting and forecasting. Common to all the factor estimation methods employed here is that they can handle unbalanced datasets, as typically faced in real‐time forecast applications owing to publication lags. In particular, we focus on variants of static and dynamic principal components as well as Kalman filter estimates in state‐space factor models. As an empirical illustration of the technique, we use a large monthly dataset of the German economy to nowcast and forecast quarterly GDP growth. We find that the factor estimation methods do not differ substantially, whereas the most parsimonious MIDAS projection performs best overall. Finally, quarterly models are in general outperformed by the Factor MIDAS models, which confirms the usefulness of mixed‐frequency techniques that can exploit timely information from business cycle indicators.
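A minimal sketch of an unrestricted Factor MIDAS projection, assuming only NumPy; the data are simulated and the factor is a static first principal component, so this is an illustration of the idea rather than the authors' implementation:

```python
# Illustrative sketch: a principal-component factor from monthly indicators
# feeding an unrestricted MIDAS regression for quarterly GDP growth.
import numpy as np

rng = np.random.default_rng(1)
T_m = 240                                  # 240 months = 80 quarters
panel = rng.normal(size=(T_m, 50))         # 50 standardized monthly indicators

# Static principal-component factor (first PC of the monthly panel)
u, s, vt = np.linalg.svd(panel - panel.mean(0), full_matrices=False)
factor = u[:, 0] * s[0]

# Unrestricted MIDAS: regress each quarter's GDP growth on the quarter's three
# monthly factor observations, with the lag coefficients left unrestricted (OLS)
f_q = factor.reshape(-1, 3)                # (80, 3): months within each quarter
gdp = 0.5 * f_q.mean(1) + rng.normal(scale=0.2, size=f_q.shape[0])
X = np.column_stack([np.ones(len(gdp)), f_q])
beta, *_ = np.linalg.lstsq(X, gdp, rcond=None)
nowcast = X[-1] @ beta                     # fitted projection for the latest quarter (in-sample, for illustration)
print(beta, nowcast)
```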

6.
Abstract Does housework reduce the market wage, and if so, does it have a similar impact for males and females? In this paper, we survey and evaluate the recent and growing empirical literature on the linkages between housework and the wage rate. The review is motivated by unexplained gender wage gaps across studies, which consider personal and market‐related factors. We focus on this less‐studied aspect of wage determination. We consider the required modelling framework, and provide standardized estimated effects of housework on the hourly wage across studies. We evaluate how this literature has addressed potential estimation problems, in particular, the endogeneity of housework, concavity of the housework–wage function, threshold effects and work effort effects. We conclude that the evidence across ordinary least squares, instrumental variable, fixed effects and two‐stage least squares results casts serious doubt on the idea that the negative female housework–wage relationship is only driven by endogeneity bias or individual‐specific characteristics. Yet, much more needs to be done to address modelling and data requirements, and we point out likely and promising future research directions.

7.
In this paper, we assess the possibility of producing unbiased forecasts for fiscal variables in the Euro area by comparing a set of procedures that rely on different information sets and econometric techniques. In particular, we consider autoregressive moving average models, vector autoregressions, small‐scale semistructural models at the national and Euro area level, institutional forecasts (Organization for Economic Co‐operation and Development), and pooling. Our small‐scale models are characterized by the joint modelling of fiscal and monetary policy using simple rules, combined with equations for the evolution of all the relevant fundamentals for the Maastricht Treaty and the Stability and Growth Pact. We rank models on the basis of their forecasting performance using the mean square and mean absolute error criteria at different horizons. Overall, simple time‐series methods and pooling work well and are able to deliver unbiased, or only slightly upward‐biased, forecasts for the debt–GDP dynamics. This result is mostly due to the short sample available, to the robustness of simple methods to structural breaks, and to the difficulty of modelling the joint behaviour of several variables in a period of substantial institutional and economic change. A bootstrap experiment highlights that, even when the data are generated using the estimated small‐scale multi‐country model, simple time‐series models can produce more accurate forecasts because of their parsimonious specification.
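For concreteness, a small sketch (with made-up numbers) of ranking individual forecasts by mean square and mean absolute error and of forming a pooled forecast by simple averaging:

```python
# Illustrative sketch (hypothetical numbers): forecast evaluation by MSE and
# MAE at one horizon, plus a pooled forecast formed by averaging.
import numpy as np

actual = np.array([1.2, 0.8, 1.5, 1.1, 0.9])          # realized values
forecasts = {
    "ARMA": np.array([1.0, 0.9, 1.4, 1.3, 1.0]),
    "VAR":  np.array([1.5, 0.6, 1.8, 1.0, 0.7]),
    "OECD": np.array([1.1, 0.8, 1.6, 1.2, 1.1]),
}
forecasts["pool"] = np.mean(list(forecasts.values()), axis=0)

for name, f in forecasts.items():
    err = f - actual
    print(f"{name:5s}  MSE={np.mean(err**2):.4f}  MAE={np.mean(np.abs(err)):.4f}")
```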

8.
Airline traffic forecasting is important to airlines and regulatory authorities. This paper examines a number of approaches to forecasting short- to medium-term air traffic flows. It contributes as a rare replication study, testing a variety of alternative modelling approaches. The econometric models employed include autoregressive distributed lag (ADL) models, time-varying parameter (TVP) models and an automatic method for econometric model specification. A vector autoregressive (VAR) model and various univariate alternatives are also included to deliver unconditional forecast comparisons. Various approaches for taking into account interactions between contemporaneous air traffic flows are examined, including pooled ADL models and models enhanced with the addition of a “world trade” variable. Based on the analysis of a number of forecasting error measures, it is concluded, first, that pooled ADL models that include the “world trade” variable outperform the alternatives, in particular univariate methods; and, second, that automatic modelling procedures are enhanced through judgmental intervention. In contrast to earlier results, the TVP models do not improve accuracy. Depending on the preferred error measure, the difference in accuracy may be substantial.
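A minimal sketch, on simulated data, of the kind of ADL regression referred to above, with the flow explained by its own lag and by current and lagged values of a hypothetical "world trade" indicator (statsmodels assumed):

```python
# Illustrative sketch (hypothetical data): an ADL(1,1) regression of an air
# traffic flow on its own lag and on current and lagged "world trade" values.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
T = 120
trade = np.cumsum(rng.normal(0.3, 1.0, T))             # world trade proxy
traffic = np.zeros(T)
for t in range(1, T):
    traffic[t] = (0.6 * traffic[t - 1] + 0.4 * trade[t] - 0.1 * trade[t - 1]
                  + rng.normal(scale=0.5))

y = traffic[1:]
X = sm.add_constant(np.column_stack([traffic[:-1], trade[1:], trade[:-1]]))
adl = sm.OLS(y, X).fit()
print(adl.params)          # const, y(t-1), trade(t), trade(t-1)
```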

9.
Abstract This paper presents an overview of various models of regional growth that have appeared in the literature in the last 40 years. It considers the past, and therefore supply‐side models, such as the standard neoclassical, juxtaposed against essentially demand‐side approaches such as the export‐base and cumulative causation models (as integrated into the Kaldorian approach); before moving on to the ‘present’ and more recent versions of the neoclassical model involving spatial weights and ‘convergence clubs’, as well as new economic geography core–periphery models, and the ‘innovation systems’ approach. A key feature of the more recent literature is an attempt to explicitly include spatial factors into the model, and thus there is a renewed emphasis on agglomeration economies and spillovers. Discussing ‘present’ and ‘future’ approaches to regional growth overlaps with the current emphasis in the literature on the importance of more intangible factors such as the role of ‘knowledge’ and its influence on growth. Finally, there is a discussion of the greater emphasis that needs to be placed at the ‘micro‐level’ when considering what drives growth, and thus factors such as inter alia firm heterogeneity, entrepreneurship and absorptive capacity.

10.
11.
General‐to‐Specific (GETS) modelling has witnessed major advances thanks to the automation of multi‐path GETS specification search. However, the estimation complexity associated with financial models constitutes an obstacle to automated multi‐path GETS modelling in finance. Making use of a recent result, we provide and study simple but general and flexible methods that automate financial multi‐path GETS modelling. Starting from a general model where the mean specification can contain autoregressive terms and explanatory variables, and where the exponential volatility specification can include log‐ARCH terms, asymmetry terms, volatility proxies and other explanatory variables, the algorithm we propose returns parsimonious mean and volatility specifications.
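The following is a deliberately simplified, single-path sketch of a general-to-specific search for the mean equation only (the paper's algorithm searches multiple paths and also handles the volatility specification); statsmodels is assumed and the data are simulated:

```python
# Illustrative sketch: start from a general mean equation and repeatedly drop
# the least significant regressor until all retained terms are significant.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 300
X = pd.DataFrame(rng.normal(size=(n, 6)), columns=[f"x{i}" for i in range(6)])
y = 1.0 + 0.8 * X["x0"] - 0.5 * X["x2"] + rng.normal(scale=1.0, size=n)

kept = list(X.columns)
while True:
    model = sm.OLS(y, sm.add_constant(X[kept])).fit()
    pvals = model.pvalues.drop("const")
    if pvals.max() <= 0.05 or len(kept) == 1:
        break
    kept.remove(pvals.idxmax())            # delete the least significant regressor

print("retained:", kept)
print(model.params)
```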

12.
Abstract The traditional economics of innovation, inspired by Schumpeter and more recent advances on his work, seems unable to explain why firms with similar external conditions may show greatly different performance in innovation. By contrast, the literature on corporate governance provides some useful insights for understanding corporate innovation activity, to the extent that such literature examines the economic effects of different modes of coordination between firm members. The process through which individuals integrate their human and physical resources within the firm is central to the dynamics of corporate innovation. This paper provides the first survey of the literature on this issue. We start by discussing how various theoretical approaches to the analysis of the firm deal with technological innovation. We then describe three main channels – corporate ownership, corporate finance and labour – through which a system of corporate governance shapes a firm's innovation activity. Finally, we examine the relationship between country‐level institutional settings, national patterns of corporate governance and the aggregate innovation activity of corporations. We conclude by suggesting that future research should focus more deeply on the interrelation between the various dimensions of corporate governance and on their joint effect on firm innovation.

13.
We analyse the finite sample properties of maximum likelihood estimators for dynamic panel data models. In particular, we consider transformed maximum likelihood (TML) and random effects maximum likelihood (RML) estimation. We show that the TML and RML estimators are solutions to a cubic first‐order condition in the autoregressive parameter. Furthermore, in finite samples both likelihood estimators might lead to a negative estimate of the variance of the individual‐specific effects. We consider different approaches that take into account the non‐negativity restriction for this variance, and show that they may lead to a solution different from the unique global unconstrained maximum. In an extensive Monte Carlo study we find that this issue is non‐negligible for small values of T and that different approaches can lead to different finite sample properties. Furthermore, we find that the likelihood ratio statistic provides size control in small samples, albeit with low power owing to the flatness of the log‐likelihood function. We illustrate these issues by modelling US state-level unemployment dynamics.
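For orientation, a generic dynamic panel specification of the kind discussed (the paper's exact parameterization may differ) is:

```latex
y_{it} = \rho\, y_{i,t-1} + \eta_i + \varepsilon_{it},
\qquad i = 1,\dots,N,\quad t = 1,\dots,T,
```

where $\eta_i$ is the individual-specific effect with variance $\sigma_{\eta}^{2}$ and $\varepsilon_{it}$ is an idiosyncratic error; it is the estimate of $\sigma_{\eta}^{2}$ that can turn negative in finite samples, and the first-order condition in $\rho$ that the paper shows to be cubic.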

14.
Asset Pricing with Observable Stochastic Discount Factors
The stochastic discount factor (SDF) model provides a general framework for pricing assets. By specifying the discount factor suitably, it encompasses most of the theories currently in use, including the CAPM and the consumption CAPM. The SDF model has been based on the use of single and multiple factors, and on latent and observed factors. In most situations, and especially for the term structure, single‐factor models are inappropriate, whilst latent variables require the somewhat arbitrary specification of generating processes and are difficult to interpret. In this paper we survey the principal implementations of the SDF model for bonds, equity and FOREX and propose a new approach. This is based on the use of multiple observable factors and on modelling the joint distribution of excess returns and the factors using a multivariate GARCH-in-mean process. We argue that single‐equation and VAR models, although widely used in empirical finance, are in general inappropriate as they do not satisfy the no-arbitrage condition. Since risk premia arise from conditional covariation between the returns and the factors, both a multivariate context and the inclusion of conditional covariances in the conditional mean process are essential. We explain how apparent exceptions, such as the CIR and Vasicek models, in fact meet this requirement — but at a price. We explain our new approach, discuss how it might be implemented and present some empirical evidence, mainly from our own research. Partly to enable comparisons to be made, the survey also includes evidence from recent empirical work using more traditional approaches.
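As background, the standard SDF pricing relations (textbook results, not specific to this paper) make the role of conditional covariances explicit:

```latex
1 = \mathbb{E}_t\!\left[M_{t+1} R_{i,t+1}\right]
\;\;\Longrightarrow\;\;
\mathbb{E}_t\!\left[R_{i,t+1} - R_{f,t}\right]
  = -R_{f,t}\,\operatorname{Cov}_t\!\left(M_{t+1},\, R_{i,t+1}\right),
\qquad R_{f,t} = 1/\mathbb{E}_t\!\left[M_{t+1}\right],
```

which is why conditional covariances between returns and the factors driving $M_{t+1}$ must appear in the conditional mean, as in the multivariate GARCH-in-mean approach the paper advocates.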

15.
The population characteristics observed by selecting a complex sample from a finite, identified population are the result of at least two processes: the process which generates the values attached to the units in the finite population, and the process of selecting the sample of units from the population. In this paper we propose that the resulting observations be viewed as the joint realization of both processes. We overcome the inherent difficulty in modelling the joint processes of generation and selection by exploring second-moment and other simplifying assumptions. We obtain general expressions for the mean and covariance function of the joint processes and show that several overdispersion models discussed in the literature for the analysis of complex surveys are a direct consequence of our formulation under particular sampling schemes and population structures.
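For intuition, a generic two-stage decomposition (not the paper's exact expressions) writes $\xi$ for the process generating the finite-population values and $p$ for the sample-selection process, so that for a statistic $\hat{\theta}$:

```latex
\mathbb{E}\big[\hat{\theta}\big]
  = \mathbb{E}_{\xi}\!\Big[\mathbb{E}_{p}\big[\hat{\theta}\mid \text{population}\big]\Big],
\qquad
\operatorname{Var}\big[\hat{\theta}\big]
  = \mathbb{E}_{\xi}\!\Big[\operatorname{Var}_{p}\big(\hat{\theta}\mid \text{population}\big)\Big]
  + \operatorname{Var}_{\xi}\!\Big[\mathbb{E}_{p}\big(\hat{\theta}\mid \text{population}\big)\Big].
```

The additional between-population term is one way such joint formulations can generate the overdispersion patterns mentioned above.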

16.
COINTEGRATION AND DYNAMIC TIME SERIES MODELS
ABSTRACT. This paper provides a survey of some of the recent developments in the field of econometric modelling with cointegrated time series. In particular, we describe the testing and estimation procedures which have become increasingly popular in the recent applied literature. In addition to the 'two-stage' procedure proposed by Engle and Granger, we consider extensions to the modelling of dynamic models with cointegrated variables, such as the estimation of models with multiple cointegration vectors, simultaneous systems, and models with seasonally integrated and cointegrated variables. Furthermore, we illustrate the practical application of the techniques described in the paper by means of a tutorial data set.
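A minimal sketch of the Engle and Granger 'two-stage' procedure on simulated data, assuming statsmodels; note that the residual-based test strictly requires Engle-Granger critical values rather than the standard ADF ones:

```python
# Illustrative sketch: estimate the cointegrating regression by OLS, test the
# residuals for a unit root, then use the lagged residual in an error-correction
# model for the first differences.
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(4)
T = 200
x = np.cumsum(rng.normal(size=T))                      # I(1) driver
y = 2.0 + 0.5 * x + rng.normal(scale=0.3, size=T)      # cointegrated with x

# Stage 1: cointegrating regression and residual-based unit-root test
stage1 = sm.OLS(y, sm.add_constant(x)).fit()
resid = stage1.resid
adf_stat, pvalue, *_ = adfuller(resid)                 # standard ADF output, for illustration only
print(f"ADF on residuals: stat={adf_stat:.2f}, p={pvalue:.3f}")

# Stage 2: error-correction model in first differences
dy, dx = np.diff(y), np.diff(x)
ecm_X = sm.add_constant(np.column_stack([dx, resid[:-1]]))
ecm = sm.OLS(dy, ecm_X).fit()
print(ecm.params)      # const, short-run effect of dx, error-correction term
```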

17.
Modeling the correlation structure of returns is essential in many financial applications. Considerable evidence from empirical studies has shown that the correlation among asset returns is not stable over time. A recent development in the multivariate stochastic volatility literature is the application of inverse Wishart processes to characterize the evolution of return correlation matrices. Within the inverse Wishart multivariate stochastic volatility framework, we propose a flexible correlated latent factor model to achieve dimension reduction and capture the stylized fact of ‘correlation breakdown’ simultaneously. The parameter estimation is based on existing Markov chain Monte Carlo methods. We illustrate the proposed model with several empirical studies. In particular, we use high‐dimensional stock return data to compare our model with competing models based on multiple performance metrics and tests. The results show that the proposed model not only describes historical stylized facts reasonably well but also provides the best overall performance.

18.
Rational expectations modelling has been criticized for assuming that economic agents can quickly learn about and compute rational price expectations. In response, various authors have studied theoretical models in which economic agents use adaptive statistical rules to develop price expectations. A goal of this literature has been to compare the resulting learning equilibria with rational expectations equilibria. The lack of empirical analysis in this literature suggests that adaptive learning makes otherwise linear dynamic models nonlinear and intractable for current econometric technology. In response to this lack of empirical work, this paper applies to post-1989 monthly data for Poland a new method for modelling learning about price expectations. The key idea of the method is to modify Cagan’s backward-looking adaptive-expectations hypothesis about the way expectations are actually updated into a forward-looking characterization which instead specifies the result of learning. It says that, whatever the details of how learning actually takes place, price expectations are expected to converge geometrically to rationality. The method is tractable because it involves linear dynamics. The paper contributes substantively by analyzing the recent Polish inflation, theoretically by characterizing learning, and econometrically by using learning as a restriction for identifying (i.e., estimating with finite variance) unobserved price expectations with the Kalman filter.
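A generic illustration of the identification idea (the paper's exact state-space form may differ): cast the unobserved expectation in a linear state-space model whose deviation from rationality decays geometrically, and extract it with the Kalman filter:

```latex
\begin{aligned}
\text{observation:}\quad & \pi_t = \alpha + \beta\,\pi^{e}_{t} + u_t,\\
\text{state:}\quad & \pi^{e}_{t} - \mathbb{E}_{t-1}[\pi_t]
  = \lambda\,\big(\pi^{e}_{t-1} - \mathbb{E}_{t-2}[\pi_{t-1}]\big) + v_t,
  \qquad 0 < \lambda < 1,
\end{aligned}
```

so that $\lambda$ governs the geometric convergence of the price expectation $\pi^{e}_{t}$ towards its rational counterpart, and it is this convergence restriction, rather than the details of the learning rule, that disciplines the latent expectation series.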

19.
We survey recent empirical evidence on monetary policy rules, and find that the emphasis in the political economy literature on institutional design (e.g. central bank independence and inflation targeting) is exaggerated. Formal institutional reform seems neither a necessary nor a sufficient condition for observing shifts in monetary policy rules. However, there is no doubt that in some cases (e.g. the UK following the start of inflation targeting in 1992, and Bank of England independence in 1997), a major shift in monetary policy conduct is detectable. We also highlight the problems in explicitly testing the predictions of the political economy literature. Semi-structural modelling approaches, such as time-varying VAR models, may be more useful in understanding policy rules and the interaction between policy shifts and changes in the transmission mechanism.

20.
The current crisis and discussions, in the euro area in particular, show that sovereign debt crises and defaults are no longer confined to developing economies. Following crises in many Latin American countries, the literature on quantitative dynamic macro models of sovereign default has been advancing rapidly. The current debate should take note of the findings of this literature, an extensive overview of which is provided in this paper. The paper also discusses the inherent difficulties, as well as the possibilities, of integrating this type of model into standard business cycle models (RBC and DSGE models). This is likely to be particularly helpful when using such models to analyse upcoming issues in the euro area, such as a suitable sovereign insolvency law or the assumption of joint liability.
