Similar Literature
20 similar documents found
1.
Keenan et al. (J Risk Uncertain 24:264–277, 2002) introduced a measure of downside risk aversion (third-order risk aversion), and in their Theorem 1 they established four equivalent definitions of increased downside risk aversion. This result can be viewed as a higher-order extension of Theorem 3 in Diamond et al. (J Econ Theory 8:337–360, 1974). We consider fourth-order risk aversion and establish four equivalent definitions of increased fourth-order risk aversion. Our result can likewise be viewed as a higher-order extension of Theorem 1 in Keenan et al. (2002).
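For orientation, risk attitudes at successive orders are conventionally summarized by ratios of derivatives of the utility function u. The following are the standard second-, third- and fourth-order coefficients (Arrow–Pratt risk aversion, prudence, temperance); they are closely related to, though not necessarily identical with, the measures analyzed by Keenan et al.:

```latex
% Standard higher-order risk-attitude coefficients for a utility u.
% Increased aversion at a given order is typically characterized through
% conditions acting on the corresponding ratio.
A(x) = -\frac{u''(x)}{u'(x)}, \qquad
P(x) = -\frac{u'''(x)}{u''(x)}, \qquad
T(x) = -\frac{u''''(x)}{u'''(x)}.
```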

2.
The productive efficiency of a firm can be seen as composed of two parts, one persistent and one transient. The received empirical literature on the measurement of productive efficiency has paid relatively little attention to the difference between these two components. Ahn and Sickles (Econ Rev 19(4):461–492, 2000) suggested some approaches that pointed in this direction. The possibility was also raised in Greene (Health Econ 13(10):959–980, 2004. doi:10.1002/hec.938), who expressed some pessimism over the possibility of distinguishing the two empirically. Recently, Colombi (A skew normal stochastic frontier model for panel data, 2010) and Kumbhakar and Tsionas (J Appl Econ 29(1):110–132, 2012), in a milestone extension of the stochastic frontier methodology, have proposed a tractable model based on panel data that promises to provide separate estimates of the two components of efficiency. The approach developed in the original presentation proved very cumbersome to implement in practice. Colombi (2010) notes that FIML estimation of the model is 'complex and time consuming.' In a sequence of papers, Colombi (2010), Colombi et al. (A stochastic frontier model with short-run and long-run inefficiency random effects, 2011; J Prod Anal, 2014), Kumbhakar et al. (J Prod Anal 41(2):321–337, 2012) and Kumbhakar and Tsionas (2012) have suggested other strategies, including a four-step least squares method. The main point of this paper is that full maximum likelihood estimation of the model is neither complex nor time consuming. The extreme complexity of the log likelihood noted in Colombi (2010) and Colombi et al. (2011, 2014) is reduced by using simulation and exploiting the Butler and Moffitt (Econometrica 50:761–764, 1982) formulation. We develop a practical full information maximum simulated likelihood estimator for the model. The approach is very effective, strikingly simple to apply, and uses all of the sample distributional information to obtain the estimates. We also implement the panel data counterpart of the Jondrow et al. (J Econom 19(2–3):233–238, 1982) estimator for technical or cost inefficiency. The technique is applied in a study of the cost efficiency of Swiss railways.
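The simulation idea is the generic maximum simulated likelihood pattern: the time-invariant component is integrated out of each firm's panel likelihood by averaging over random draws, following the Butler–Moffitt insight that the T periods are conditionally independent given the firm effect. A minimal sketch of that pattern for a plain random-effects panel (illustrative only: the paper's model adds skew-normal persistent and transient inefficiency terms, and all data here are synthetic):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)

# Toy panel: N firms, T periods, y_it = b0 + b1*x_it + w_i + e_it.
N, T, R = 50, 8, 200
x = rng.normal(size=(N, T))
w = rng.normal(scale=0.5, size=(N, 1))
y = 1.0 + 0.8 * x + w + rng.normal(scale=0.3, size=(N, T))

draws = rng.normal(size=(1, R, 1))           # simulated draws for w_i

def neg_simulated_loglik(theta):
    b0, b1 = theta[0], theta[1]
    sw, se = np.exp(theta[2]), np.exp(theta[3])
    resid = y[:, None, :] - b0 - b1 * x[:, None, :] - sw * draws  # (N,R,T)
    # Butler-Moffitt idea: conditional on a draw for the firm effect, the
    # T densities multiply; then average over draws to integrate it out.
    per_draw = norm.logpdf(resid, scale=se).sum(axis=2)           # (N,R)
    m = per_draw.max(axis=1, keepdims=True)                       # log-sum-exp
    li = m.squeeze() + np.log(np.exp(per_draw - m).mean(axis=1))
    return -li.sum()

res = minimize(neg_simulated_loglik, x0=np.zeros(4), method="BFGS")
print(res.x[:2], np.exp(res.x[2:]))  # estimates of b0, b1, sigma_w, sigma_e
```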

3.
The present study proposes an analysis process to compare and contrast different approaches to content analysis. Building on previous findings (Righettini and Sbalchiero, ICPP—International Conference on Public Policy, 2015) on consumer protection in the annual speeches of Italian Presidents of AGCOM, delivered between 2000 and 2015, statistical analyses of textual data are applied to the same set of texts in order to compare and contrast results and to assess whether integrating different approaches enriches the findings. The analysis draws on topic-based methods for the classification of context units (Reinert, Les Cah l'Anal Donnees 8(2):187–198, 1983), text clustering and lexical correspondence analysis (Lebart et al., Exploring textual data, 1998) within a general framework of content analysis and 'lexical worlds' exploration (Reinert, Lang Soc 66:5–39, 1993), i.e., the identification of the main topics and words used by AGCOM Presidents to talk about consumer protection. Results confirm the strengths and opportunities of the topic-detection approach and shed light on how quantitative methods can become useful to political scientists as available policy documents grow in number and size. One methodological innovation of this article is that it supplements the use of word categories in traditional content analysis with an automated topic analysis that overcomes problems of reliability, replicability, and inferential circularity.
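A minimal sketch of such a pipeline: cluster tf-idf representations of context units and inspect the top terms of each cluster, a rough stand-in for Reinert-style 'lexical worlds' detection (the corpus and parameters below are hypothetical, and the cited methods use hierarchical classification and correspondence analysis rather than k-means):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Hypothetical corpus: each string stands for one context unit (e.g.,
# a paragraph from an annual AGCOM speech).
docs = [
    "consumer protection tariffs transparency",
    "broadband access network regulation",
    "consumer complaints billing disputes",
    "spectrum auctions network operators",
]

vec = TfidfVectorizer()
X = vec.fit_transform(docs)

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

# Top terms per cluster: the vocabulary each "lexical world" is built from.
terms = vec.get_feature_names_out()
for k in range(km.n_clusters):
    top = km.cluster_centers_[k].argsort()[::-1][:4]
    print(f"cluster {k}:", [terms[i] for i in top])
```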

4.
The literature on neighbor designs, as introduced by Rees (Biometrics 23:779–791, 1967), is mainly devoted to construction methods and provides few results on their statistical properties, such as efficiency and optimality. A review of the available literature, with special emphasis on the optimality of neighbor designs under various fixed-effects interference models, is given in Filipiak and Markiewicz (Commun Stat Theory Methods 46:1127–1143, 2017). The aim of this paper is to verify whether the designs presented by Filipiak and Markiewicz (2017) as universally optimal under fixed interference models remain universally optimal under models with random interference effects. Moreover, it is shown that for a specified covariance matrix of the random interference effects, a universally optimal design under the mixed interference model with block effects is universally optimal over a wider class of designs. The method presented by Filipiak and Markiewicz (Metrika 65:369–386, 2007) is extended and then applied to mixed interference models with and without block effects.
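For reference, universal optimality in this literature is usually established via Kiefer's sufficient conditions, which are standard tools rather than results specific to this paper: a design d* is universally optimal in a class D if its information matrix is completely symmetric and maximizes the trace:

```latex
% Kiefer's (1975) sufficient conditions for universal optimality of d*
% over a class of designs D (t treatments; J_t is the all-ones matrix):
C_{d^*} = a\, I_t + b\, J_t \quad \text{(complete symmetry)},
\qquad
\operatorname{tr} C_{d^*} = \max_{d \in \mathcal{D}} \operatorname{tr} C_d .
```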

5.
Sanyu Zhou, Metrika 80(2):187–200, 2017
A simultaneous confidence band is a useful statistical tool in a simultaneous inference procedure. In recent years several papers have considered various applications of simultaneous confidence bands; see, for example, Al-Saidy et al. (Biometrika 59:1056–1062, 2003), Liu et al. (J Am Stat Assoc 99:395–403, 2004), Piegorsch et al. (J R Stat Soc 54:245–258, 2005) and Liu et al. (Aust N Z J Stat 55(4):421–434, 2014). In this article, we provide methods for constructing one-sided hyperbolic simultaneous confidence bands for both the multiple regression model over a rectangular region and the polynomial regression model over an interval. These methods use numerical quadrature. Examples are included to illustrate the methods. The approaches can be applied to more general regression models, such as fixed-effect or random-effect generalized linear regression models, to construct large-sample approximate one-sided hyperbolic simultaneous confidence bands.
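Although the paper computes its critical constants by numerical quadrature, a crude Monte Carlo version shows what those constants are: the 1−α quantile of the supremum of the band's pivotal quantity over the region of interest. A minimal sketch for a one-sided band over an interval in polynomial regression (the design, grid, and sample sizes are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical design: cubic regression on n points in [0, 1].
n, p = 30, 4
xobs = np.linspace(0, 1, n)
X = np.vander(xobs, p, increasing=True)       # columns 1, x, x^2, x^3
XtX_inv = np.linalg.inv(X.T @ X)
L = np.linalg.cholesky(XtX_inv)
nu = n - p

# Grid over the interval on which the band is required.
grid = np.vander(np.linspace(0, 1, 201), p, increasing=True)
denom = np.sqrt(np.einsum("ij,jk,ik->i", grid, XtX_inv, grid))

# Pivot: sup_x  x'(bhat - b) / (sigma_hat * sqrt(x'(X'X)^{-1} x)).
reps = 20000
sup = np.empty(reps)
for r in range(reps):
    z = L @ rng.normal(size=p)                # bhat - b, with sigma = 1
    s = np.sqrt(rng.chisquare(nu) / nu)       # sigma_hat / sigma
    sup[r] = np.max(grid @ z / denom) / s

crit = np.quantile(sup, 0.95)                 # one-sided critical constant
print(round(crit, 3))
```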

6.
This study analyzes the link between the construction of an effective psychological contract with the organization and the success of the socialization process. For this purpose, 241 employees of a call-center organization were contacted. A questionnaire composed of measures of Organizational Socialization (Haueter et al., Journal of Vocational Behavior, 63, 20–39, 2003), Psychological Contract (Rousseau 1995), Job Satisfaction (Wanous et al., Journal of Applied Psychology, 82, 247–252, 1997) and Organizational Commitment (Allen and Meyer 1990) was administered. Results indicate that organizational socialization may influence the development of the psychological contract, thus shaping job satisfaction and organizational commitment. This research was developed in an interdisciplinary perspective, taking into account the peculiarities of the Italian legal framework. In this regard, the analysis focuses on how the E.U. flexicurity strategy has been implemented in Italy, according to the recent reform of labour market regulation (2012–13), and on the specific regulations introduced for call centres.

7.
The stochastic search variable selection (SSVS) proposed by George and McCulloch (J Am Stat Assoc 88:881–889, 1993) is one of the most popular variable selection methods for linear regression models. Many efforts have been made in the literature to improve its computational efficiency; however, most of them change its original Bayesian formulation, so the comparisons are not fair. This work focuses on improving the computational efficiency of stochastic search variable selection while leaving its original Bayesian formulation unchanged. The improvement is achieved by developing a new Gibbs sampling scheme different from that of George and McCulloch (1993). A remarkable feature of the proposed scheme is that it samples the regression coefficients from their posterior distributions in a componentwise manner, so that the expensive computation of the inverse of the information matrix, required by the algorithm of George and McCulloch (1993), can be avoided. Moreover, since the original Bayesian formulation remains unchanged, stochastic search variable selection using the proposed Gibbs sampling scheme is as efficient as that of George and McCulloch (1993) in terms of assigning large probabilities to promising models. Some numerical results support these findings.
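A minimal sketch of the componentwise idea: each coefficient is drawn from its univariate full conditional, so no p × p matrix is ever inverted. This uses the SSVS normal-mixture prior of George and McCulloch but holds the error variance fixed for brevity (their sampler also updates it), and the data are synthetic:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Toy data: only the first two of five predictors matter.
n, p = 100, 5
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, -1.5, 0.0, 0.0, 0.0])
y = X @ beta_true + rng.normal(size=n)

tau, c, sigma2, prior_pi = 0.1, 10.0, 1.0, 0.5   # SSVS hyperparameters
beta = np.zeros(p)
gamma = np.zeros(p, dtype=int)
xtx = (X ** 2).sum(axis=0)

keep = []
for it in range(3000):
    for j in range(p):
        # Componentwise update of beta_j: no matrix inversion needed.
        r = y - X @ beta + X[:, j] * beta[j]      # residual excluding x_j
        v_j = (c * tau) ** 2 if gamma[j] else tau ** 2
        prec = xtx[j] / sigma2 + 1.0 / v_j
        mean = (X[:, j] @ r / sigma2) / prec
        beta[j] = rng.normal(mean, np.sqrt(1.0 / prec))
        # Update the inclusion indicator gamma_j given beta_j.
        a = prior_pi * norm.pdf(beta[j], scale=c * tau)
        b = (1 - prior_pi) * norm.pdf(beta[j], scale=tau)
        gamma[j] = rng.random() < a / (a + b)
    if it >= 1000:
        keep.append(gamma.copy())

print("posterior inclusion probabilities:", np.mean(keep, axis=0))
```

Because each update touches a single coordinate, the cost of one sweep is O(np), whatever the current model size.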

8.
In this paper the unit root tests proposed by Dickey and Fuller (DF) and their rank counterpart suggested by Breitung and Gouriéroux (J Econom 81(1):7–27, 1997) (BG) are analytically investigated in the presence of additive outlier (AO) contamination. The results show that the limiting distribution of the former test is outlier dependent, while that of the latter is outlier free. The finite-sample size properties of these tests are also investigated under different scenarios of testing contaminated unit root processes. In the empirical study, the alternative DF rank test suggested by Granger and Hallman (J Time Ser Anal 12(3):207–224, 1991) (GH) is also considered. In Fotopoulos and Ahn (J Time Ser Anal 24(6):647–662, 2003), these unit root rank tests were analytically and empirically investigated and compared to the DF test, but with outlier-free processes. Thus, the results provided in this paper complement those previous studies in the context of time series with additive outliers. Like the DF and GH unit root tests, the BG test is sensitive to AO contamination, but less severely. In practical situations where additive outliers are suspected, the general conclusion is that the DF and GH unit root tests should be avoided, whereas the BG approach can still be used.
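The practical message is easy to reproduce: additive outliers distort the DF statistic computed on the levels, while a rank transform dampens their influence. A minimal sketch in the spirit of the Granger–Hallman rank test (statsmodels' adfuller stands in for the exact statistics, and the BG rank statistic itself differs in detail):

```python
import numpy as np
from scipy.stats import rankdata
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(2)

# Random walk with a few large additive outliers.
T = 300
y = np.cumsum(rng.normal(size=T))
contaminated = y.copy()
contaminated[[60, 150, 240]] += 15.0          # additive outliers

for label, series in [("levels", contaminated),
                      ("ranks", rankdata(contaminated))]:
    stat, pval = adfuller(series, regression="c")[:2]
    print(f"DF on {label}: stat={stat:.2f}, p={pval:.3f}")
```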

9.
Schwartz (Nous 7, 1972, Definition 3) introduces a generalization of the Condorcet criterion, the classical approach to rational choice in the presence of cycles, and defines the Schwartz set. Deb (J Econ Theory 16:103–110, 1977) shows that the Schwartz set consists of the maximal elements according to the transitive closure of the asymmetric part of a binary relation corresponding to a choice process or representing the decision maker's preferences. This note provides a short and simple proof of Deb's theorem on the characterization of the Schwartz set.
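Deb's characterization makes the Schwartz set directly computable: form the asymmetric part of the relation, take its transitive closure, and keep the maximal elements. A minimal sketch (the example relation is hypothetical and contains a top cycle):

```python
import numpy as np

# R[i][j] = True if alternative i is weakly preferred to j (hypothetical
# data with a cycle: a beats b, b beats c, c beats a, and all beat d).
R = np.array([[1, 1, 0, 1],
              [0, 1, 1, 1],
              [1, 0, 1, 1],
              [0, 0, 0, 1]], dtype=bool)

P = R & ~R.T                       # asymmetric part: strict preference

# Transitive closure of P via Warshall's algorithm.
TC = P.copy()
n = len(TC)
for k in range(n):
    TC |= np.outer(TC[:, k], TC[k, :])

# Schwartz set: maximal elements of the closure, i.e., no alternative j
# reaches i without i reaching j back.
schwartz = [i for i in range(n)
            if not any(TC[j, i] and not TC[i, j] for j in range(n))]
print(schwartz)                    # -> [0, 1, 2]: the cycle; d is excluded
```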

10.
We study the optimal dynamic portfolio exposure to predictable default risk, taking inspiration from the search for yield by means of defaultable assets observed before the 2007–2008 crisis and in its aftermath. Under no arbitrage, default risk is compensated by a 'yield pickup' that can strongly attract aggressive investors via an investment-horizon effect in their optimal non-myopic portfolios. We show this by formulating the optimal dynamic portfolio problem of Kim and Omberg (Rev Financ Stud 9:141–161, 1996) for a defaultable risky asset and by rigorously proving the existence of nirvana-type solutions. We achieve this contribution to the portfolio optimization literature by means of a careful, closed-form-yielding adaptation to our defaultable-asset setting of the general convex duality approach of Kramkov and Schachermayer (Ann Appl Probab 9(3):904–950, 1999; Ann Appl Probab 13(4):1504–1516, 2003).

11.
Following Kojima and Ünver (Econ Theory 55(3):515–544, 2014) and Afacan (Math Soc Sci 66(2):176–179, 2013), this paper provides two characterizations of the Boston school choice mechanism, which is determined by the student-proposing immediate acceptance algorithm. A mechanism is the Boston mechanism if and only if it satisfies one of the following two groups of axioms: favoring higher ranks and weak fairness; or favoring higher ranks, rank monotonicity, and rank rationality.
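For readers unfamiliar with it, the student-proposing immediate acceptance algorithm assigns seats round by round, and every acceptance is final, which is what 'favoring higher ranks' captures axiomatically. A minimal sketch (the preferences, priorities and capacities are hypothetical):

```python
# Student-proposing immediate acceptance (Boston) algorithm: in round k,
# each unassigned student applies to the k-th school on her list; schools
# admit applicants in priority order up to remaining capacity, and these
# acceptances are final.

prefs = {"s1": ["A", "B"], "s2": ["A", "B"], "s3": ["B", "A"]}
priority = {"A": ["s2", "s1", "s3"], "B": ["s1", "s2", "s3"]}
capacity = {"A": 1, "B": 1}

def boston(prefs, priority, capacity):
    remaining = dict(capacity)
    assigned = {}
    rounds = max(len(p) for p in prefs.values())
    for k in range(rounds):
        # Collect this round's applications from unassigned students.
        applicants = {}
        for s, p in prefs.items():
            if s not in assigned and k < len(p):
                applicants.setdefault(p[k], []).append(s)
        # Immediate (final) acceptance in priority order.
        for school, apps in applicants.items():
            apps.sort(key=priority[school].index)
            for s in apps[: remaining[school]]:
                assigned[s] = school
            remaining[school] -= min(len(apps), remaining[school])
    return assigned

print(boston(prefs, priority, capacity))   # -> {'s2': 'A', 's3': 'B'}
```

Note that s1 ends up unassigned despite having top priority at B: by round 2, B's seat has already been given away irrevocably, which is the classic way the Boston mechanism violates fairness.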

12.
In 2007 Nicholas Stern's Review (Science 317:201–202, 2007) estimated that global GDP would shrink by 5–20% due to climate change, which brought forth calls to reduce emissions by 30–70% over the next 20 years. Stern's results were contested by Weitzman (J Econ Lit XLV(3):703–724, 2007), who argued for more modest reductions in the near term, and by Nordhaus (Science 317:201–202, 2007), who questioned the low discount rate and coefficient of relative risk aversion employed in the Stern Review, leading him to argue that 'the central questions about global-warming policy—how much, how fast, and how costly—remain open.' We present a simulation model developed by Färe et al. (Time substitution with application to data envelopment analysis, 2009) on intertemporal resource allocation that allows us to shine some light on these questions. The empirical specification constrains the amount of undesirable output a country can produce over a given period, with the country choosing the magnitude and timing of its emission reductions. We examine the production technology of 28 OECD countries over 1992–2006, in which countries produce real GDP and CO2 using capital and labor, and we simulate the magnitude and timing of reductions necessary to comply with the Kyoto Protocol. This tells us 'how fast' and 'how much'. Comparing observed GDP with simulated GDP under the emissions constraints tells us 'how costly'. We find these costs to be relatively low if countries are allowed to reallocate production decisions across time, and that emissions should be cut gradually at the beginning of the period, with larger cuts starting in 2000.
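The production technology in such exercises is typically an activity-analysis (DEA) model. The sketch below sets up a single-period LP in which the good output is expandable while the undesirable output is held at its observed level (weak disposability); it is illustrative only, with hypothetical data, whereas the paper's time-substitution model links many such periods through intertemporal constraints:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: 4 countries, inputs (capital, labor),
# good output (GDP), bad output (CO2).
X = np.array([[5.0, 3.0], [6.0, 2.5], [4.0, 4.0], [7.0, 3.5]])  # inputs
y = np.array([10.0, 12.0, 9.0, 14.0])                           # GDP
b = np.array([8.0, 7.0, 9.0, 10.0])                             # CO2

k = 0                           # country under evaluation
# Variables: (lambda_1..lambda_4, theta); maximize theta.
c = np.r_[np.zeros(4), -1.0]
# Good output expandable: sum(lam*y) >= theta*y_k  ->  -y'lam + y_k*theta <= 0
A_ub = np.vstack([np.r_[-y, y[k]],
                  np.c_[X.T, np.zeros((2, 1))]])    # inputs: X'lam <= x_k
b_ub = np.r_[0.0, X[k]]
# Weak disposability of the bad output: sum(lam*b) == b_k.
A_eq = np.r_[b, 0.0].reshape(1, -1)
b_eq = [b[k]]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 5)
print("output expansion factor theta =", round(-res.fun, 3))
```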

13.
The UK clothing industry has seen the extensive offshoring of manufacturing, which has created fragmented global supply chains; these present a range of supply issues and challenges, including many related to sustainability. Reshoring is the reversal of a previous offshoring decision, thereby 'bringing manufacturing back home' (Gray et al., J Supply Chain Manag 49(2):27–33, 2013), and can be motivated by increased costs and supply management problems. While not a new phenomenon, the reshoring of activities is growing in practice, and there is an imperative for academic research (Fratocchi et al., J Purch Supply Manag 20:54–59, 2014). Through an in-depth longitudinal case study, this paper explores how sustainability can be addressed through reshoring; the studied UK-based clothing SME has strong principles and is explicitly committed to bringing its supply chain 'home'. There is a recognised need for more OM research using a social lens (Burgess and Singh, Oper Manag Res 5:57–68, 2012), so Social Network Theory (SNT) is employed to examine the reshoring decision-making process. SNT applies a relational, qualitative approach to understanding the interactions between network actors, focusing on the types and strengths of relationships and how they provide context for decisions (Galaskiewicz, J Supply Chain Manag 47(1):4–8, 2011). The findings demonstrate the importance of socially complex, long-term relationships in managing a sustainable supply network. These relationships contribute to the resources that a firm can harness in its supply practices, and SNT extends this with its emphasis on the strength of ties with suppliers and the trust, reciprocity and shared meanings strong ties engender. For the studied firm these advantages derive from its localised supply chain and collaborative supplier relationships, and its progressive reshoring of activities is integral to achieving its sustainability principles.

14.
The study reported in this article examines simultaneously the impact of individual entrepreneurship and collective entrepreneurship on innovation in small business. It addresses a weakness in previous entrepreneurship research, which either focuses only on the individual entrepreneur's role in innovation (Miller, Management Science 29:770–791, 1983) or stresses only the importance of collective entrepreneurship (Reich, Harvard Business Review 65(3):22–83, 1987; Stewart 1989). The results of structural equation modeling analysis of data from more than 200 small businesses show that both individual entrepreneur(s) and the collective contribute to innovation in small business. The results also reveal complex relationships between the two types of entrepreneurship in terms of their impact on innovation. Factors that contribute to collective entrepreneurship were found to contribute to individual entrepreneurship, while factors often associated with individual entrepreneurship were found to have a negative impact on collective entrepreneurship. Communication among members of the small business, which was found to contribute directly to collective entrepreneurship, was also found to contribute to the individual entrepreneur's knowledge of emerging markets, products, and technologies. In contrast, centralized decision making, which was found to have a direct negative impact on innovation, was also found to have a negative impact on collaboration and communication. Implications of the study are discussed.

15.
16.
Cyrille Joutard, Metrika 80(6–8):663–683, 2017
We establish strong large deviation results for an arbitrary sequence of random vectors under some assumptions on the normalized cumulant generating function. In other words, we give asymptotic approximations for a multivariate tail probability of the same kind as the one obtained by Bahadur and Rao (Ann Math Stat 31:1015–1027, 1960) for the sample mean in the one-dimensional case. The proof of our results follows the same lines as in Chaganty and Sethuraman (J Stat Plan Inference 55:265–280, 1996). We also present three statistical applications to illustrate our results: the first deals with a vector of independent sample variances, the second with a Gaussian multiple linear regression model, and the third with the multivariate Nadaraya–Watson estimator. Numerical results are also presented for the first two applications.
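For orientation, the one-dimensional Bahadur–Rao approximation being extended can be stated as follows for i.i.d. non-lattice summands with cumulant generating function Λ:

```latex
% Bahadur-Rao (1960): sharp large deviations for the sample mean.
% \Lambda^*(a) = \sup_t \{ta - \Lambda(t)\} is the Legendre transform,
% t_a solves \Lambda'(t_a) = a, and \sigma_a^2 = \Lambda''(t_a).
P\!\left(\bar{X}_n \ge a\right)
  = \frac{e^{-n\Lambda^*(a)}}{t_a \sigma_a \sqrt{2\pi n}}
    \bigl(1 + o(1)\bigr), \qquad a > \mathbb{E}[X_1].
```

The "strong" qualifier refers to exactly this sharp prefactor, as opposed to the logarithmic rate alone.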

17.
18.
This paper provides empirical evidence suggesting that fundamentals matter for stock price fluctuations once the temporal instability underpinning stock price relations is accounted for. Specifically, this study extends the out-of-sample forecasting methodology of Meese and Rogoff (J Int Econ 14:3–24, 1983) to the stock market after explicitly testing for parameter nonconstancy using recursive techniques. The predictive ability of a present value model based on Imperfect Knowledge Economics (IKE) is found to match that of the pure random walk benchmark at short forecasting horizons and to perform significantly better at medium to longer-run horizons, based on conventional measures of predictability and direction-of-change statistics. In addition, a cointegrating relation is found only within regimes of statistical parameter constancy. Augmenting the MR methodology in a piecewise linear fashion yields empirical results in favor of a fundamentals-based account of stock price behavior, overturning the recent results of Flood and Rose (2010).
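The Meese–Rogoff exercise is a recursive out-of-sample horse race against the random walk: refit on data through t only, forecast t+1, and compare losses. A minimal sketch of the pattern with a synthetic fundamentals proxy (RMSE only; the paper also uses direction-of-change statistics):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic data: (log) price linked to a fundamentals proxy plus noise.
T = 200
f = np.cumsum(rng.normal(size=T))             # fundamentals
p = 0.8 * f + rng.normal(scale=0.5, size=T)   # price

start = 100
err_model, err_rw = [], []
for t in range(start, T - 1):
    # Recursive estimation: predictive OLS of p[i+1] on f[i], i < t.
    Xt = np.c_[np.ones(t), f[:t]]
    coef, *_ = np.linalg.lstsq(Xt, p[1:t + 1], rcond=None)
    forecast = coef[0] + coef[1] * f[t]       # one-step-ahead forecast
    err_model.append(p[t + 1] - forecast)
    err_rw.append(p[t + 1] - p[t])            # random walk: no change

rmse = lambda e: np.sqrt(np.mean(np.square(e)))
print("model RMSE:", round(rmse(err_model), 3),
      "| random walk RMSE:", round(rmse(err_rw), 3))
```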

19.
In an influential paper, Färe and Lovell (J Econ Theory 19:150–162, 1978) proposed an (input-based) technical efficiency index designed to correct two fundamental inadequacies of the Debreu–Farrell index: its failure to satisfy (1) indication (the index equals 1 if and only if the input bundle is technically efficient) and (2) weak monotonicity (an increase in any one input quantity cannot increase the value of the index). Färe et al. (1985) extended the index to measure efficiency in the full space of input and output quantities. Unfortunately, this extended index fails to satisfy not only indication and monotonicity at the boundary (of output space), but also weak monotonicity. We show, however, that a simple modification of the index corrects these flaws. To demonstrate the tractability of our proposal, we apply it to baseball batting performance, in which zero outputs occur frequently.
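For reference, the original input-based Färe–Lovell (Russell) measure contracts each input coordinatewise and averages the contraction factors, rather than applying a single radial scaling (standard formulation for strictly positive inputs; the paper's modified graph-space index differs):

```latex
% Fare-Lovell (Russell) input measure of technical efficiency for an
% input bundle x producing output y, with input requirement set L(y):
FL(x, y) \;=\; \min \Bigl\{ \tfrac{1}{n} \sum_{i=1}^{n} \lambda_i :
  (\lambda_1 x_1, \ldots, \lambda_n x_n) \in L(y),\;
  \lambda_i \in (0, 1] \Bigr\}.
```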

20.
This paper investigates the price and volatility relationship in European short-term interest rate markets. Cointegration analysis is used to analyse the long- and short-run relationship, and a GARCH BEKK model is estimated to analyse the volatility transmission between the markets. The stability of the long-run relationship is also examined using the Bai and Perron (Econometrica 66(1):47–78, 1998; J Appl Econ 18(1):1–22, 2003) structural break methodology. The results show that the relationship between the EURIBOR spot deposit rate and the EURIBOR futures contract has changed significantly since 2001, and several structural breaks are present in the 13-year sample period. During periods in which a long-run relationship is present, the spot deposit rate generally leads the futures rate in price discovery. In the short run there is bi-directional causality between the markets. There is also significant evidence of volatility transmission from the spot market to the futures market throughout the sample period.
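A minimal sketch of the cointegration step on a synthetic spot–futures pair (statsmodels' Engle–Granger test; the GARCH BEKK estimation and Bai–Perron break tests require specialized code not shown here):

```python
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(4)

# Synthetic cointegrated pair: futures = spot + stationary basis.
T = 500
spot = np.cumsum(rng.normal(scale=0.05, size=T)) + 3.0
futures = spot + rng.normal(scale=0.1, size=T)    # mean-zero basis

tstat, pval, crit = coint(spot, futures)
print(f"Engle-Granger t-stat = {tstat:.2f}, p-value = {pval:.4f}")
# A small p-value rejects "no cointegration", consistent with a long-run
# spot-futures relationship.
```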
