Similar Literature
 Found 20 similar records (search time: 31 ms)
1.
We adapt the Bierens (1990) test to the I-regular models of Park and Phillips (2001). Bierens (1990) defines the test hypothesis in terms of a conditional moment condition. Under the null hypothesis, the moment condition holds with probability one. The probability measure used is the one induced by the variables in the model, which are assumed to be strictly stationary. Our framework is nonstationary, so this approach is not always applicable. We show that the Lebesgue measure can be used instead in a meaningful way. The resultant test is consistent against all I-regular alternatives.
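The conditional-moment idea behind Bierens-type tests can be sketched in a few lines: under the null E[u | x] = 0, the weighted moment E[u exp(τx)] vanishes for (almost) every τ. The sketch below, with simulated data and a single fixed τ, computes only a standardized sample moment; it does not reproduce the paper's nonstationary asymptotics or the Lebesgue-measure construction.

```python
import numpy as np

def bierens_moment_stat(u, x, tau=1.0):
    """Standardized sample analogue of E[u * exp(tau * x)].  Under the
    null E[u | x] = 0 this moment is zero for (almost) every tau, which
    is the idea behind Bierens-type consistent specification tests; the
    asymptotic theory for the nonstationary I-regular case is not
    reproduced here."""
    w = np.exp(tau * x)              # Bierens exponential weight function
    m = u * w                        # pointwise moment contributions
    n = len(m)
    return np.sqrt(n) * m.mean() / m.std(ddof=1)

rng = np.random.default_rng(0)
x = rng.standard_normal(500)
u_null = rng.standard_normal(500)    # E[u | x] = 0 holds
u_alt = u_null + x**2 - 1.0          # conditional moment condition fails
stat_null = bierens_moment_stat(u_null, x)
stat_alt = bierens_moment_stat(u_alt, x)
```

Under the null the statistic is roughly standard normal for any fixed τ; consistency against all alternatives requires varying τ, which is the part of the construction omitted in this sketch.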

2.
This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates from a high dimensional set of psychological measurements.

3.
With the expansion of urbanization driven by population growth and industrial activity, urban and suburban areas face a variety of environmental threats. Although a growing body of research and urban policy has advocated and practiced the development of green infrastructure (GI) to support a sustainable urban environment, evaluation frameworks for GI development aimed at promoting environmental sustainability remain insufficient. Moreover, the Analytic Hierarchy Process (AHP) commonly applied in the published literature makes the unrealistic assumption that dimensions and criteria are mutually independent, which rarely holds in real-world decision problems. Therefore, the purpose of this study is to construct an evaluation framework, comprising four dimensions and ten related criteria, using a new hybrid modified multiple attribute decision-making (MADM) model for developing and improving GI to promote environmental sustainability. The model combines three MADM methodologies: the Decision-Making Trial and Evaluation Laboratory (DEMATEL) for constructing the influential network relation map (INRM) that exposes the complex influential inter-relationships, the DEMATEL-based Analytic Network Process (DANP) for determining the influential weights, and VlseKriterijumska Optimizacija I Kompromisno Resenje (VIKOR) for evaluating and deriving improvement strategies for six different GIs. The DEMATEL and DANP results of the empirical study suggest that decision-makers should pay more attention to improving the Design (D4) and Materials (D2) dimensions when using GI to promote environmental sustainability; once these dimensions are enhanced, Species (D1) and Energy (D3) improve in step.
From the perspective of criteria, five are core and need to be addressed first: increasing the green coverage rate (B9), utilizing sustainable materials (B4), using ecological engineering (B8), shaping species biodiversity (B1), and reducing energy consumption (B5). The modified VIKOR reveals that "grass swales" are the comparatively better choice among the six GIs for promoting environmental sustainability toward the aspiration level. This MADM model therefore provides a more convincing assessment framework and improvement strategies for GI development, and can be applied more conveniently and reasonably than traditional methods such as AHP or ANP.
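The DEMATEL step referenced above has a compact standard form: normalize a direct-influence matrix and compute the total-relation matrix T = D (I - D)^-1, whose row and column sums give prominence and net influence. A minimal sketch with an illustrative 4x4 matrix for the four dimensions (hypothetical scores, not the paper's survey data):

```python
import numpy as np

def dematel(A):
    """Standard DEMATEL steps: normalize the direct-influence matrix,
    then compute the total-relation matrix T = D (I - D)^-1."""
    s = max(A.sum(axis=1).max(), A.sum(axis=0).max())  # normalization factor
    D = A / s                                          # normalized direct influence
    T = D @ np.linalg.inv(np.eye(len(A)) - D)          # total-relation matrix
    r, c = T.sum(axis=1), T.sum(axis=0)                # dispatched / received influence
    return T, r + c, r - c                             # T, prominence, net influence

# Hypothetical direct-influence scores among the four dimensions
# (D1 Species, D2 Materials, D3 Energy, D4 Design); illustrative only.
A = np.array([[0, 1, 2, 1],
              [3, 0, 2, 2],
              [2, 1, 0, 1],
              [3, 3, 3, 0]], dtype=float)
T, prominence, net = dematel(A)
```

A positive net value marks a dimension as a net influencer (a cause), which is how DEMATEL-based studies justify prioritizing dimensions such as Design.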

4.
After the self-criticism of Richard Posner (A failure of capitalism. The crisis of '08 and the descent into depression. Harvard University Press, Cambridge, 2009a, and some later papers), a position shared in part by Gary Becker, it is right to ask what exactly remains of the Chicago School, which came into being in the early 1930s with Knight and made a name for itself in the 1980s as the new orthodoxy with Lucas (Reh) and Fama (Emh). The financial crisis has, in this sense, cast the debate between the Saltwater school (the American universities on the coasts) and the Freshwater school (the universities near the Great Lakes, including Chicago) in a new light. This article traces the development of the Chicago School, the consolidation of conservative think-tanks (especially the Mont Pelerin Society, whose members have included, among others, von Hayek and Friedman) and the more recent positions of the School, including some that are less organic (Diamond, Kashyap, Rajan, Zingales). At the centre of the debate is the question of the failures of the State vs the failures of the market, and the role of institutions. On a methodological level the relationship between law and economics (L&E) is also in discussion, which leads to an interesting comparison between Chicago (again Posner, past and present) and Yale (Calabresi). After describing the echoes of the debate across the Atlantic (with particular reference to Italy), the article asks whether the anomalies found in the consolidated paradigm (Reh, Emh, but also L&E) will lead to the abandonment of pre-constituted models in favour of a less rigid theoretical framework (Ike together with Keynes and the Chicagoan Knight, but also, to a certain extent, von Hayek) or simply to a pause for reflection.

5.
The distinction between peace and conflict in contemporary international relations is no longer well-defined. Leveraging modern technology, hostile action below the threshold of war has become increasingly effective. The objective of such aggression is often to influence the opinions, emotions, and, ultimately, the decisions of a nation's citizenry. This work presents two new game-theoretic frameworks, denoted prospect games and regulated prospect games, to inform defensive policy against these threats. These frameworks respectively model (a) the interactions of competing entities influencing a populace and (b) the preemptive actions of a regulating agent to alter such interactions. Prospect games and regulated prospect games are designed to be adaptable, depending on the assumed nature of persuaders' interactions and their rationality. The contributions herein are a modeling framework for competitive influence operations under a common set of assumptions, model variants corresponding to scenario-specific modifications of selected assumptions, the illustration of practical solution methods for the suite of models, and a demonstration on a representative scenario, with the ultimate goal of providing a quantifiable, tractable, and rigorous framework upon which national policies defending against competitive influence can be identified.

6.
We develop new methods for representing the asset-pricing implications of stochastic general equilibrium models. We provide asset-pricing counterparts to impulse response functions and the resulting dynamic value decompositions (DVDs). These methods quantify the exposures of macroeconomic cash flows to shocks over alternative investment horizons and the corresponding prices or investors' compensations. We extend the continuous-time methods developed in Hansen and Scheinkman (2012) and Borovička et al. (2011) by constructing discrete-time, state-dependent, shock-exposure and shock-price elasticities as functions of the investment horizon. Our methods are applicable to economic models that are nonlinear, including models with stochastic volatility.

7.
We consider classes of multivariate distributions which can model skewness and are closed under orthogonal transformations. We review two classes of such distributions proposed in the literature and focus our attention on a particular, yet quite flexible, subclass of one of these classes. Members of this subclass are defined by affine transformations of univariate (skewed) distributions that ensure the existence of a set of coordinate axes along which there is independence and the marginals are known analytically. The choice of an appropriate m-dimensional skewed distribution is then restricted to the simpler problem of choosing m univariate skewed distributions. We introduce a Bayesian model comparison setup for selection of these univariate skewed distributions. The analysis does not rely on the existence of moments (allowing for any tail behaviour) and uses equivalent priors on the common characteristics of the different models. Finally, we apply this framework to multi-output stochastic frontiers using data from Dutch dairy farms.
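The construction described, affine transforms of independent univariate skewed coordinates, can be sketched with NumPy alone using the Azzalini stochastic representation of the skew-normal; the dimension, skewness parameters, and location vector below are illustrative, not taken from the paper.

```python
import numpy as np

def skew_normal(alpha, size, rng):
    """Standard skew-normal draws via the Azzalini representation:
    X = delta*|U0| + sqrt(1 - delta^2)*U1, with U0, U1 iid N(0, 1)."""
    delta = alpha / np.sqrt(1.0 + alpha**2)
    u0, u1 = rng.standard_normal((2, size))
    return delta * np.abs(u0) + np.sqrt(1.0 - delta**2) * u1

rng = np.random.default_rng(1)
m, n = 3, 1000
alphas = [4.0, -2.0, 0.0]            # one skewness parameter per coordinate axis
z = np.column_stack([skew_normal(a, n, rng) for a in alphas])

O, _ = np.linalg.qr(rng.standard_normal((m, m)))  # random orthogonal matrix
mu = np.array([1.0, -1.0, 0.5])                   # illustrative location vector
y = mu + z @ O.T                                  # the affine transformation
```

Rotating y back by O recovers the independent coordinates exactly, which is the property that reduces the m-dimensional modelling problem to m univariate choices.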

8.
Although the link between household size and consumption has strong empirical support, there is no consistent way in which demographics are dealt with in standard life-cycle models. We study the relationship between the predictions of the Single Agent model (the standard in the literature) and a simple model extension (the Demographics model) in which deterministic changes in household size and composition affect optimal consumption decisions. We show theoretically that the Demographics model is conceptually preferable to the Single Agent model, as it captures economic mechanisms ignored by the latter. However, our quantitative analysis demonstrates that differences in predicted consumption are negligible across models when standard calibration strategies are used. This suggests that it is largely irrelevant which model specification is used.

9.
Strategic human resource management addresses the need to create vertical linkages of human resource management (HRM) attributes with corporate strategy as well as horizontal linkages that integrate practices among HRM functions. Most models commonly focus on either vertical or horizontal linkages. This paper utilizes three categories of person–environment fit to create both vertical and horizontal linkages. Based on a strategic contingency framework, it demonstrates how person–environment fit relates to organizational competencies that support corporate strategy. Furthermore, it demonstrates how person–environment fit can be used to promote internal alignment of HRM practices. Implications of this approach for strategic human resource management are then discussed.

10.
Bitcoin, the first digital cryptocurrency, offers through its price cycle an opportunity to test bubble theory in the digital currency era. Building on existing asset bubble theory, we verify the Bitcoin bubble against production cost by applying VAR and LPPL models, and the method achieves good predictive power. The following conclusions are reached: (1) PECR is constructed to depict the degree of deviation between price and production cost, while BC is used to measure the size of the bubble in the price, and both are effective measures; (2) the number of unique addresses is a suitable measure of the use value of Bitcoin, a result that passes the Granger causality test; (3) PECR and BC are verified via the LPPL model, and the next large bubble is expected in the second half of 2020. Considering that Bitcoin will see its 'output halved' in May 2020, this prediction is a high-probability event.
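The LPPL model named above has a standard functional form for the log price, ln p(t) = A + B(tc - t)^m + C(tc - t)^m cos(ω ln(tc - t) + φ), where tc is the critical (bubble-burst) time. A minimal sketch with illustrative, not fitted, parameter values:

```python
import numpy as np

def lppl_log_price(t, tc, A, B, C, m, omega, phi):
    """Log-periodic power law (LPPL) for the log price; tc is the
    critical time at which the super-exponential growth ends."""
    dt = tc - t                                   # time remaining before tc
    return A + B * dt**m + C * dt**m * np.cos(omega * np.log(dt) + phi)

# Illustrative parameter values on a synthetic time grid (t < tc).
t = np.linspace(0.0, 9.9, 200)
f = lppl_log_price(t, tc=10.0, A=8.0, B=-1.5, C=0.2, m=0.5, omega=6.0, phi=0.0)
```

In practice the parameters are estimated by nonlinear least squares on observed log prices, and the fitted tc is read as the predicted timing of the bubble; as t approaches tc the power-law terms vanish and the log price approaches A.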

11.
We investigate financial markets under model risk caused by uncertain volatilities. To this end, we consider a financial market that features volatility uncertainty. We use the notion of G-expectation and its corresponding G-Brownian motion recently introduced by Peng (2007) to ensure a mathematically consistent framework. Our financial market consists of a riskless asset and a risky stock with price process modeled by geometric G-Brownian motion. We adapt the notion of arbitrage to this more complex situation, and consider stock price dynamics which exclude arbitrage opportunities. Volatility uncertainty results in an incomplete market. We establish the interval of no-arbitrage prices for general European contingent claims, and deduce explicit results in the Markovian case.
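In the kind of Markovian case the abstract alludes to, a known result is that for a convex payoff such as a European call, the interval of no-arbitrage prices under a volatility band [sig_lo, sig_hi] is spanned by the Black-Scholes prices at the extreme volatilities. A sketch of that interval (the volatility band and contract terms are illustrative):

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, r, sigma, T):
    """Black-Scholes price of a European call."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# Volatility uncertainty band [sig_lo, sig_hi]; contract terms illustrative.
sig_lo, sig_hi = 0.15, 0.35
lower = bs_call(100.0, 100.0, 0.01, sig_lo, 1.0)  # lower end of the price interval
upper = bs_call(100.0, 100.0, 0.01, sig_hi, 1.0)  # upper end of the price interval
```

For non-convex payoffs the extremes are instead characterized by a nonlinear (Black-Scholes-Barenblatt) PDE, which is where the G-expectation machinery earns its keep.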

12.
A common conviction among quality practitioners is that the implementation and practice of Statistical Process Control (SPC) within the production environment leads to improvements in output quality. How and why this conviction might be valid, on the other hand, has not been well conceptualized in the scientific domain. Furthermore, what other benefits, beyond improved output quality, might arise from the implementation and practice of SPC is a question that has yet to be adequately addressed in the literature. This article proposes a theoretical framework to provide a better conceptual understanding of these two issues. The framework extends the conceptualization and definition of a recently proposed construct (SPC Implementation/Practice) by formally postulating direct and indirect effects for this construct. These postulated effects go beyond the intuitively obvious claim that SPC Implementation/Practice positively affects quality: they (1) specifically relate Process Quality and Product Quality to SPC Implementation/Practice and (2) incorporate the Job Characteristics Model into an explanation of how and why SPC Implementation/Practice affects front-line operators' internal work motivation and job satisfaction levels.

13.
Recent work by Sarasvathy and Venkataraman (Entrepreneurship Theory and Practice, 113–135, 2011) suggests that entrepreneurship may offer an alternative to the scientific method for solving some of the increasingly complex problems facing modern society. This paper adopts an entrepreneurial-method framework to explore how social enterprises (SEs) can more efficiently and effectively provide goods and services to the needy. SEs have the potential to improve efficiency and effectiveness by innovating new products, processes, strategies, and/or business models to better meet their beneficiaries' needs. Three case studies of SEs are used to explore the process and outcomes of the entrepreneurial method. In addition, propositions are developed to illustrate the social and economic performance advantages of SEs and to offer managerial implications for enhanced practice.

14.
This paper introduces a general framework for the fair allocation of indivisible objects when each agent can consume at most one (e.g., houses, jobs, queuing positions) and monetary compensations are possible. This framework enables us to deal with identical objects and monotonicity of preferences in ranking objects. We show that the no-envy solution is the only solution satisfying equal treatment of equals, Maskin monotonicity, and a mild continuity property. The same axiomatization holds if the continuity property is replaced by a neutrality property.
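The no-envy condition itself is simple to check computationally: each agent must weakly prefer her own object-plus-transfer bundle to every other agent's bundle under her own valuation. A minimal sketch with hypothetical valuations (not an example from the paper):

```python
def is_envy_free(values, assignment, transfers):
    """No-envy check: agent i's utility for her own bundle
    (object value plus transfer) must be at least her utility for
    every other agent j's bundle, evaluated with i's own valuations."""
    n = len(values)
    for i in range(n):
        u_own = values[i][assignment[i]] + transfers[i]
        for j in range(n):
            if values[i][assignment[j]] + transfers[j] > u_own + 1e-12:
                return False
    return True

# Hypothetical two-agent, two-object example.
values = [[10.0, 4.0],    # agent 0's valuations of objects 0 and 1
          [8.0, 6.0]]     # agent 1's valuations
assignment = [0, 1]       # agent i receives object assignment[i]
```

With transfers (0, 3) neither agent envies the other, while with zero transfers agent 1 envies agent 0's object; the paper's axiomatic result singles out the set of all such envy-free allocations.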

15.
The problem of irreversible investment with idiosyncratic risk is studied by interpreting market incompleteness as a source of ambiguity over the appropriate no-arbitrage discount factor. The maxmin utility over multiple priors framework is used to model and solve the irreversible investment problem. Multiple priors are modeled using the notion of κ-ignorance. This set-up is used to analyze finitely lived options. For infinitely lived options, the notion of constant κ-ignorance is introduced. For these sets of density generators, the corresponding optimal stopping problem is solved for general (in-)finite horizon optimal stopping problems driven by geometric Brownian motion. It is argued that an increase in the set of priors delays investment, whereas an increase in the degree of market completeness can have a non-monotonic effect on investment.

16.
Exponential smoothing procedures, in particular those recommended by Brown [1962], are used extensively in many areas of economics, business and engineering. It is shown in this paper that:
  1. Brown's forecasting procedures are optimal in terms of achieving minimum mean square error forecasts only if the underlying stochastic process belongs to a limited subclass of ARIMA (p, d, q) processes. This makes explicit what assumptions are made when using these procedures.
  2. The implication of point 1 is that users of Brown's procedures tacitly assume that the stochastic processes occurring in the real world come from this particular restricted subclass of ARIMA (p, d, q) processes. No reason can be found why these particular models should occur more frequently than others.
  3. It is further shown that even if a stochastic process which would lead to Brown's model occurred, the actual methods used for making the forecasts are clumsy, and much simpler procedures can be employed.
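Point 1 can be illustrated concretely: simple exponential smoothing, the basic Brown-type procedure, is MSE-optimal exactly when the data follow an ARIMA(0,1,1) process whose moving-average parameter is tied to the smoothing constant (1 - α under the common sign convention). A minimal sketch of the recursion, with an illustrative series and α:

```python
def ses_forecast(y, alpha, level0=None):
    """Simple exponential smoothing (a Brown-type procedure): the final
    smoothed level is the forecast for every future period.  The
    recursion is MSE-optimal only for the restricted ARIMA(0,1,1)
    subclass discussed in the paper."""
    level = y[0] if level0 is None else level0
    for obs in y[1:]:
        level = alpha * obs + (1.0 - alpha) * level   # exponential update
    return level
```

For example, `ses_forecast([1.0, 2.0, 3.0], 0.5)` runs the level through 1 → 1.5 → 2.25, so 2.25 is the forecast at all horizons; that flat forecast function is precisely what only an ARIMA(0,1,1) data-generating process justifies.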

17.
This paper investigates the maximum horizon at which conditioning information can be shown to have value for univariate time series forecasts. In particular, we consider the problem of determining the horizon beyond which forecasts from univariate time series models of stationary processes add nothing to the forecast implicit in the unconditional mean. We refer to this as the content horizon for forecasts, and provide a formal definition of the corresponding forecast content function at horizons s = 1, …, S. This function depends on parameter estimation uncertainty as well as on the autocorrelation structure of the process. We show that for autoregressive processes it is possible to give an asymptotic expression for the forecast content function, and we show by simulation that the expression gives a good approximation even at modest sample sizes. The results are applied to the growth rate of GDP and to inflation, using US and Canadian data.
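For the AR(1) case the idea can be made concrete: the s-step forecast μ + ρ^s(y_t - μ) shrinks geometrically toward the unconditional mean, and one can ask at which horizon the shrinking signal drops below estimation noise in ρ. The rule below is an illustrative simplification of that trade-off, not the paper's formal forecast content function:

```python
import numpy as np

def ar1_forecast_path(y_t, mu, rho, horizons):
    """s-step AR(1) forecasts mu + rho**s * (y_t - mu): the conditional
    forecast shrinks geometrically toward the unconditional mean."""
    return np.array([mu + rho**s * (y_t - mu) for s in horizons])

def content_horizon(rho, sigma_rho, y_t, mu, max_s=100):
    """Illustrative rule of thumb (not the paper's definition): the
    first horizon s at which the forecast's deviation from the mean
    falls inside a one-standard-error band for rho**s, using the
    delta-method standard error s * rho**(s-1) * sigma_rho."""
    for s in range(1, max_s + 1):
        signal = abs(rho**s * (y_t - mu))
        noise = s * rho**(s - 1) * sigma_rho * abs(y_t - mu)
        if signal <= noise:
            return s
    return max_s
```

Under this rule the condition reduces to ρ ≤ s·σ_ρ, so more persistent processes (larger ρ) and more precisely estimated models (smaller σ_ρ) have longer content horizons, which matches the qualitative message of the abstract.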

18.
This paper introduces the concept of harmonic growth as an extended interpretation of the notion of development, and discusses its measurement via the Harmonic Growth Index (HGI). Growth is seen as harmonic when the pattern of a benchmark time series, here a measure of wealth such as per capita GDP, is matched by a similar pattern in socio-economic series. Unlike the most widely used indicators in the literature, which measure development at a single point in time, the HGI measures the degree to which a social indicator's time-series pattern matches that of GDP. The index is a function, ranging in [0, 1], of the coefficients of uniform B-splines fitted to each time series within the functional data framework. A case study of Mediterranean welfare countries (Greece, Italy, Portugal, and Spain) over 1996–2007 shows critical differences in the selected indicators, which can be ascribed to their dissimilar development models. The HGI can also be considered a general index for measuring similarity between time patterns, or an alternative to correlation for (not necessarily linear) time series.

19.
In this study, platform-based product family design for assembled products is reconceptualised into a framework for platform-based design of non-assembled products in the process industries. As a point of departure, platform-based design is defined as shared logic in a company's activities, and a "function-based" leveraging strategy is employed to identify non-assembled products with similar characteristics and commonalities among product families, related production processes, and raw materials. It is proposed that a production platform philosophy and platform-based design of non-assembled products should rely on Product platforms, Process platforms and Raw-material platforms that are well integrated into common Production platforms, in an end-to-end perspective. However, platform-based design of non-assembled products may differ depending on whether company production relies solely on a captive raw-material base, on raw materials purchased on the open market, or on both. The congruence of the development of Production platforms with the QFD methodology and the House of Quality was noted in this study, as was the simplicity of applying the methodology to homogeneous products compared to multi-level hierarchical assembled products. It is argued that the proposed conceptual framework can be used in internal company discussions and reviews of whether and how such an approach to product innovation can be a fruitful avenue to explore and adapt.

20.
In this paper we consider the issue of unit root testing in cross-sectionally dependent panels. We consider panels that may be characterized by various forms of cross-sectional dependence, including (but not exclusive to) the popular common factor framework. We consider block bootstrap versions of the group-mean (Im et al., 2003) and the pooled (Levin et al., 2002) unit root coefficient DF tests for panel data, originally proposed for a setting of no cross-sectional dependence beyond a common time effect. The tests, suited for testing for unit roots in the observed data, can be easily implemented, as no specification or estimation of the dependence structure is required. Asymptotic properties of the tests are derived for T going to infinity and N finite. Asymptotic validity of the bootstrap tests is established in very general settings, including the presence of common factors and cointegration across units. Properties under the alternative hypothesis are also considered. In a Monte Carlo simulation, the bootstrap tests are found to have rejection frequencies much closer to nominal size than those of the corresponding asymptotic tests. The power properties of the bootstrap tests appear to be similar to those of the asymptotic tests.
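The mechanics of a block bootstrap unit root test can be sketched for a single series: resample first differences in overlapping blocks (which preserves short-run dependence without specifying it; applied row-wise to a T x N panel, the same device preserves cross-sectional dependence), recumulate into levels, and recompute the Dickey-Fuller coefficient statistic. A minimal univariate sketch, not the paper's panel tests:

```python
import numpy as np

def df_coef(y):
    """Dickey-Fuller coefficient statistic n * (rho_hat - 1) from the
    regression of y_t on y_{t-1} (no deterministic terms, for brevity)."""
    y0, y1 = y[:-1], y[1:]
    rho = (y0 * y1).sum() / (y0 * y0).sum()
    return len(y0) * (rho - 1.0)

def moving_block_resample(dy, block, rng):
    """Resample first differences in overlapping blocks and recumulate;
    block resampling keeps short-run serial dependence intact without
    specifying or estimating it."""
    T = len(dy)
    starts = rng.integers(0, T - block + 1, size=int(np.ceil(T / block)))
    pieces = [dy[s:s + block] for s in starts]
    return np.cumsum(np.concatenate(pieces)[:T])

rng = np.random.default_rng(2)
y = np.cumsum(rng.standard_normal(200))   # a series with a unit root
stat = df_coef(y)
boot = [df_coef(moving_block_resample(np.diff(y), 10, rng)) for _ in range(199)]
pval = float(np.mean([b <= stat for b in boot]))   # left-tail bootstrap p-value
```

Because resampling is done on the differenced data, the bootstrap samples are generated under the unit root null, so the empirical left-tail fraction serves as the bootstrap p-value without any tabulated critical values.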

