Similar Documents
20 similar documents found (search time: 31 ms)
1.
Benoît and Ok (Games Econ Behav 64:51–67, 2008) show that in a society with at least three agents, any weakly unanimous social choice correspondence (SCC) is Maskin monotonic if and only if it is Nash-implementable via a simple stochastic mechanism (the Benoît–Ok Theorem). This paper fully identifies the class of weakly unanimous SCCs that are Nash-implementable via a simple stochastic mechanism endowed with Saijo’s message space specification (Saijo in Econometrica 56:693–700, 1988). It is shown that this class of SCCs is equivalent to the class of SCCs that are Nash-implementable via the Benoît–Ok Theorem.
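For reference, Maskin monotonicity is the standard condition here: an outcome chosen at a profile $R$ must remain chosen at any profile $R'$ at which it has not fallen in any agent's ranking,

$$a \in F(R) \;\text{and}\; \big[\, a \,R_i\, b \Rightarrow a \,R'_i\, b \ \text{for all agents } i \text{ and outcomes } b \,\big] \;\Longrightarrow\; a \in F(R').$$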

2.
This paper investigates the relationship between secure implementability (Saijo et al. in Theor Econ 2:203–229, 2007) and full implementability in truthful strategies (Nicolò in Rev Econ Des 8:373–382, 2004). Although secure implementability is in general stronger than full implementability in truthful strategies, this paper shows that the two properties are equivalent for social choice functions satisfying non-wastefulness (Li and Xue in Econ Theory, doi:10.1007/s00199-012-0724-0) in pure exchange economies with Leontief utility functions.

3.
In response to a question raised by Knox Lovell, we develop a method for estimating directional output distance functions with endogenously determined direction vectors based on exogenous normalization constraints. This is reminiscent of the Russell measure proposed by Färe and Lovell (J Econ Theory 19:150–162, 1978). Moreover, it is related to the slacks-based directional distance function introduced by Färe and Grosskopf (Eur J Oper Res 200:320–322, 2010a; Eur J Oper Res 206:702, 2010b). Here we show how to use the slacks-based function to estimate the optimal directions.
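As a point of reference (standard notation, not necessarily the paper's), the directional output distance function measures the feasible expansion of outputs $y$ in a direction $g$ within the technology set $T$,

$$\vec{D}_o(x, y; g) = \sup\{\beta \ge 0 : (x,\, y + \beta g) \in T\},$$

and the contribution above is to determine $g$ endogenously rather than fix it a priori.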

4.
We consider multiple-principal multiple-agent models of moral hazard: principals compete through mechanisms in the presence of agents who take unobservable actions. In this context, we provide a rationale for restricting principals to make use of simple mechanisms, which correspond to direct mechanisms in the standard framework of Myerson (J Math Econ 10:67–81, 1982). Our results complement those of Han (J Econ Theory 137(1):610–626, 2007) who analyzes a complete information setting where agents’ actions are fully contractible.

5.
In this paper we consider parametric deterministic frontier models. For example, the production frontier may be linear in the inputs, and the error is purely one-sided, with a known distribution such as exponential or half-normal. The literature contains many negative results for this model. Schmidt (Rev Econ Stat 58:238–239, 1976) showed that the Aigner and Chu (Am Econ Rev 58:826–839, 1968) linear programming estimator was the exponential MLE, but that this was a non-regular problem in which the statistical properties of the MLE were uncertain. Richmond (Int Econ Rev 15:515–521, 1974) and Greene (J Econom 13:27–56, 1980) showed how the model could be estimated by two different versions of corrected OLS, but this did not lead to methods of inference for the inefficiencies. Greene (J Econom 13:27–56, 1980) considered conditions on the distribution of inefficiency that make this a regular estimation problem, but many distributions that might plausibly be assumed do not satisfy these conditions. In this paper we show that exact (finite sample) inference is possible when the frontier and the distribution of the one-sided error are known up to the values of some parameters. We give a number of analytical results for the case of intercept only with exponential errors. In other cases that include regressors or error distributions other than exponential, exact inference is still possible but simulation is needed to calculate the critical values. We also discuss the case that the distribution of the error is unknown. In this case asymptotically valid inference is possible using subsampling methods.
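For concreteness, the Aigner–Chu estimator mentioned above fits a linear frontier $y_i = x_i'\beta - u_i$ with $u_i \ge 0$ by solving the linear program

$$\min_{\beta} \ \sum_{i=1}^{n} \left( x_i'\beta - y_i \right) \quad \text{subject to} \quad x_i'\beta \ge y_i, \quad i = 1, \dots, n,$$

which is exactly the estimator Schmidt (1976) showed to be the MLE under exponentially distributed $u_i$.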

6.
We study a keyword auction model where bidders have constrained budgets. In the absence of budget constraints, Edelman et al. (Am Econ Rev 97(1):242–259, 2007) and Varian (Int J Ind Organ 25(6):1163–1178, 2007) analyze “locally envy-free equilibrium” or “symmetric Nash equilibrium” bidding strategies in generalized second-price auctions. However, bidders often have to set daily budgets when they participate in an auction; once a bidder’s payment reaches his budget, he drops out of the auction. This raises an important strategic issue that has been overlooked in the previous literature: bidders may change their bids to inflict higher prices on their competitors, because under the generalized second-price rule the per-click price paid by a bidder is the next-highest bid. We provide budget thresholds under which the equilibria analyzed in Edelman et al. (2007) and Varian (2007) are sustained as “equilibria with budget constraints” in our setting. We then consider a simple environment with one position and two bidders and show that a search engine’s revenue with budget constraints may be larger than its revenue without budget constraints.
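A minimal sketch of the GSP pricing rule described above, assuming ranking by bid alone (no quality weights) and ignoring budgets; all names are illustrative:

```python
# Hedged sketch: generalized second-price (GSP) allocation and pricing with
# bidders ranked by bid alone; budgets and quality scores are omitted.

def gsp_outcome(bids, n_positions):
    """bids: {bidder: per-click bid}. Returns (bidder, position, price)
    triples; each winner's per-click price is the next-highest bid."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    outcome = []
    for pos in range(min(n_positions, len(ranked))):
        bidder, _ = ranked[pos]
        # GSP per-click price: the bid ranked one place below (0 if none).
        price = ranked[pos + 1][1] if pos + 1 < len(ranked) else 0.0
        outcome.append((bidder, pos, price))
    return outcome

print(gsp_outcome({"a": 5.0, "b": 3.0, "c": 1.0}, n_positions=2))
# [('a', 0, 3.0), ('b', 1, 1.0)]
```

Raising b's bid toward 5.0 leaves b's own position and price unchanged but inflates the price a pays, which is precisely the price-inflicting incentive the paper analyzes under budget constraints.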

7.
This article proposes a model of a simple economy based on a set of agent-based modeling principles. The model is based on the “trust game” formulated by Berg et al. (Games Econ Behav 10:122–142, 1995), and assumes random matching of partners, taking into account adaptive agent behavior. Simulation in the NetLogo programming environment, using profile distributions obtained from empirical studies, has shown the most successful agents to possess low parameters of trust in the role of Sender and high parameters of trustworthiness in the role of Receiver.
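A minimal sketch of one round of the Berg et al. trust game under its standard rules (the sent amount is tripled in transit); the parameter names are illustrative, not the paper's:

```python
# Hedged sketch: one round of the Berg et al. trust game with the standard
# multiplier of 3; "trust" and "trustworthiness" are illustrative parameters.

def trust_round(endowment, trust, trustworthiness, multiplier=3):
    """trust in [0, 1]: fraction of the endowment the Sender sends.
    trustworthiness in [0, 1]: fraction of the multiplied amount returned."""
    sent = trust * endowment
    received = multiplier * sent           # amount the Receiver obtains
    returned = trustworthiness * received  # amount sent back to the Sender
    sender_payoff = endowment - sent + returned
    receiver_payoff = received - returned
    return sender_payoff, receiver_payoff

# A low-trust Sender paired with a highly trustworthy Receiver:
print(trust_round(endowment=10.0, trust=0.2, trustworthiness=0.8))
# ≈ (12.8, 1.2)
```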

8.
Agent-based models are very widely used in different disciplines. In financial markets, they can be used to explain well-known features called stylised facts and to fit statistical properties of data. For this reason, they can model price movements better than standard models assuming Gaussianity. Calibration and validation are essential issues in agent-based modeling. However, calibrating such models has not yet been sufficiently considered in the literature. In this paper, a Nelder–Mead simplex algorithm coupled with a threshold accepting algorithm (Gilli and Winker in Comput Stat Data Anal 42:299–312, 2003) and a genetic algorithm have been implemented to calibrate the model presented by Farmer and Joshi (J Econ Behav Org 49:149–171, 2002), and the outcomes have been compared and discussed. The data used are closing prices of the S&P 500 Composite index, and particular attention has been devoted to the choice of the objective function.
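A minimal sketch of moment-matching calibration with the Nelder–Mead simplex, using a placeholder return model and placeholder stylised-fact moments rather than the Farmer–Joshi model or the paper's objective function (the coupling with threshold accepting, which copes with rugged objective surfaces, is likewise omitted):

```python
# Hedged sketch: calibrate a stand-in return model by matching two stylised
# facts (volatility, kurtosis) with the Nelder-Mead simplex.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
empirical_returns = rng.standard_t(df=3, size=1000) * 0.01  # stand-in data

def simulate_returns(params, n=1000):
    scale, df = params
    return rng.standard_t(df=max(df, 2.1), size=n) * abs(scale)

def moments(r):
    # Volatility and kurtosis, two commonly matched stylised facts.
    return np.array([r.std(), ((r - r.mean()) ** 4).mean() / r.var() ** 2])

def objective(params):
    return np.sum((moments(simulate_returns(params))
                   - moments(empirical_returns)) ** 2)

res = minimize(objective, x0=[0.02, 5.0], method="Nelder-Mead")
print(res.x)  # calibrated (scale, df)
```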

9.
Classical optimal strategies are notorious for producing remarkably volatile portfolio weights over time when applied with parameters estimated from data. This is predominantly explained by the difficulty of estimating expected returns accurately. In Lindberg (Bernoulli 15:464–474, 2009), a new parameterization of the drift rates was proposed with the aim of circumventing this difficulty, and a continuous-time mean–variance optimal portfolio problem was solved. This approach was further developed in Alp and Korn (Decis Econ Finance 34:21–40, 2011a) to a jump-diffusion setting. In the present paper, we solve a different portfolio problem under the market parameterization in Lindberg (Bernoulli 15:464–474, 2009). Here, the admissible investment strategies are given as the amounts of money to be held in each stock and are allowed to be adapted stochastic processes. In the references above, the admissible strategies are the deterministic and bounded fractions of the total wealth. The optimal strategy we derive is not the same as in Lindberg (Bernoulli 15:464–474, 2009), but it can still be viewed as investing equally in each of the n Brownian motions in the model. As a consequence of the problem assumptions, the optimal final wealth can become negative. The present portfolio problem is also solved in Alp and Korn (Submitted, 2011b), using the \(L^2\)-projection approach of Schweizer (Ann Probab 22:1536–1575, 1995). However, our method of proof is direct and much more accessible.

10.
The purpose of this note is twofold. First, we survey the study of the percolation phase transition on the Hamming hypercube $\{0,1\}^{m}$ obtained in the series of papers (Borgs et al. in Random Struct Algorithms 27:137–184, 2005; Borgs et al. in Ann Probab 33:1886–1944, 2005; Borgs et al. in Combinatorica 26:395–410, 2006; van der Hofstad and Nachmias in Hypercube percolation, Preprint 2012). Second, we explain how this study can be performed without the use of the so-called “lace expansion” technique. To that end, we provide a novel, simple proof that the triangle condition holds at the critical probability.
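In its standard form (the finite-graph variants used in these papers differ in detail), the triangle condition bounds the triangle diagram at the critical probability; writing $\{x \leftrightarrow y\}$ for the event that $x$ and $y$ are connected,

$$\nabla_p = \sum_{x,\,y} \mathbb{P}_p(0 \leftrightarrow x)\,\mathbb{P}_p(x \leftrightarrow y)\,\mathbb{P}_p(y \leftrightarrow 0),$$

and the condition requires $\nabla_{p_c}$ to be suitably bounded; it is the key input for establishing mean-field critical behavior.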

11.
This paper develops a parametric decomposition framework of labor productivity growth, relaxing the assumption of labor-specific efficiency. The decomposition analysis is applied to a sample of 121 developed and developing countries during the 1970–2007 period, drawn from the recently updated Penn World Tables and the Barro and Lee (A new data set of educational attainment in the world 1950–2010, NBER Working Paper No. 15902, 2010) educational databases. A generalized Cobb–Douglas functional specification is used, taking into account differences in technological structures across groups of countries, to approximate aggregate production technology using the Jorgenson and Nishimizu (Econ J 88:707–726, 1978) bilateral model of production. The measurement of labor efficiency is based on Kopp’s (Quart J Econ 96:477–503, 1981) orthogonal non-radial index of factor-specific efficiency, modified in a parametric frontier framework. The empirical results indicate that the weighted average annual rate of labor productivity growth was 1.239 % over the period analyzed. Technical change was found to be the driving force of labor productivity, while improvements in human capital and factor intensities account for 19.5 % and 12.4 % of that productivity growth, respectively. Finally, labor efficiency improvements contributed 9.8 % to measured labor productivity growth.

12.
While the high prevalence of mental illness in workplaces is more readily documented in the literature than it was ten or so years ago, it continues to remain largely within the medical and health sciences fields. This may account for the lack of information about mental illness in workplaces (Dewa et al. Healthcare Papers 5:12–25, 2004) among operational managers and human resource departments, even though such illnesses affect on average 17 % to 20 % of employees in any 12-month period (MHCC 2012; SAMHSA 2010; ABS 2007). As symptoms of mental illness have the capacity to impact negatively on employee work performance and/or attendance, the ramifications for employee performance management systems can be significant, particularly when employees choose to deliberately conceal their illness, such that any work concerns appear to derive from issues other than illness (Dewa et al. Healthcare Papers 5:12–25, 2004; De Lorenzo 2003). When employee non-disclosure of a mental illness impacts negatively in the workplace, it presents a very challenging issue for performance management for both operational managers and human resource staff. Without documented medical evidence to show that impaired work performance and/or attendance is attributable to a mental illness, the issue of performance management arises. Currently, when there is no documented medical illness, performance management policies are often brought into place to improve employee performance and/or attendance by establishing achievable employee targets. Yet, given that in any 12-month period at least a fifth of the workforce sustains a mental illness (MHCC 2012; SAMHSA 2010; ABS 2007), and that non-disclosure is significant (Barney et al. BMC Public Health 9:1–11, 2009; Munir et al. Social Science & Medicine 60:1397–1407, 2005), such targets may be unachievable for employees with a hidden mental illness. It is for these reasons that this paper reviews the incidence of mental illness in western economies, its costs, and the reasons why it is often concealed, and proposes the adoption of what are termed ‘Buffer Stage’ policies as an added tool that organisations may wish to utilise in the management of hidden medical illnesses such as mental illness.

13.
In this paper, we discuss in a general framework the design-based estimation of population parameters when sensitive data are collected by randomized response techniques. We show in close detail the procedure for estimating the distribution function of a sensitive quantitative variable and how to estimate simultaneously the population prevalence of individuals bearing a stigmatizing attribute and the distribution function for the members belonging to the hidden group. The randomized response devices by Greenberg et al. (J Am Stat Assoc 66:243–250, 1971), Franklin (Commun Stat Theory Methods 18:489–505, 1989), and Singh et al. (Aust NZ J Stat 40:291–297, 1998) are here considered as data-gathering tools.
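A minimal sketch of the unrelated-question randomized response idea in its quantitative form: with probability p the respondent reports the sensitive value, otherwise an innocuous value with known mean. The distributions and parameters below are illustrative, not those of the cited devices:

```python
# Hedged sketch: quantitative unrelated-question randomized response.
# E[report] = p * mu_sensitive + (1 - p) * mu_innocuous, so the sensitive
# mean is recovered by inverting this moment equation.
import numpy as np

def estimate_sensitive_mean(reports, p, mu_innocuous):
    return (np.mean(reports) - (1 - p) * mu_innocuous) / p

rng = np.random.default_rng(1)
true_sensitive = rng.exponential(scale=5.0, size=2000)  # hidden variable
innocuous = rng.normal(loc=2.0, scale=1.0, size=2000)   # known mean 2.0
p = 0.7
answers_sensitive = rng.random(2000) < p                # randomizing device
reports = np.where(answers_sensitive, true_sensitive, innocuous)
print(estimate_sensitive_mean(reports, p=p, mu_innocuous=2.0))  # ~5.0
```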

14.
The recombining binomial tree approach, initiated by Cox et al. (J Financ Econ 7:229–263, 1979) and extended to arbitrary diffusion models by Nelson and Ramaswamy (Rev Financ Stud 3(3):393–430, 1990) and Hull and White (J Financ Quant Anal 25:87–100, 1990a), is applied to the simultaneous evaluation of price and Greeks for the amortized fixed and variable rate mortgage prepayment option. We consider the simplified binomial tree approximation to arbitrary diffusion processes by Costabile and Massabo (J Deriv 17(3):65–85, 2010) and analyze its numerical applicability to the mortgage valuation problem for some Vasicek and CIR-like interest rate models. For fixed rates and binomial trees with about a thousand steps, we obtain very good results. For the Vasicek model, we also compare the closed-form analytical approximation of the callable fixed rate mortgage price by Xie (IAENG Int J Appl Math 39(1):9, 2009) with its binomial tree counterpart. Relative to the binomial tree values, the analytical approximation systematically underestimates the callable mortgage price (and correspondingly overestimates the prepayment option price). This numerical discrepancy increases at longer maturities and becomes too large for a reliable estimate of the prepayment option price.
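A minimal sketch of recombining-tree backward induction in the plain Cox–Ross–Rubinstein setting (a European call on a stock); the mortgage and short-rate trees studied above are more involved, so this only illustrates the machinery:

```python
# Hedged sketch: Cox-Ross-Rubinstein recombining binomial tree for a
# European call, priced by backward induction over n+1 terminal nodes.
import math

def crr_call(s0, k, r, sigma, t, n):
    dt = t / n
    u = math.exp(sigma * math.sqrt(dt))   # up factor
    d = 1.0 / u                           # down factor (tree recombines)
    q = (math.exp(r * dt) - d) / (u - d)  # risk-neutral up-probability
    disc = math.exp(-r * dt)
    # Option values at maturity on the recombining lattice.
    values = [max(s0 * u**j * d**(n - j) - k, 0.0) for j in range(n + 1)]
    # Roll back one time step at a time.
    for step in range(n, 0, -1):
        values = [disc * (q * values[j + 1] + (1 - q) * values[j])
                  for j in range(step)]
    return values[0]

print(crr_call(s0=100, k=100, r=0.05, sigma=0.2, t=1.0, n=1000))  # ~10.45
```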

15.
This paper solves an optimal insurance design problem in which both the insurer and the insured are subject to Knightian uncertainty about the loss distribution. The Knightian uncertainty is modeled in a multi-prior g-expectation framework. We obtain an endogenous characterization of the optimal indemnity that extends the classical theorems of Arrow (Essays in the Theory of Risk Bearing, Markham, Chicago, 1971) and Raviv (Am Econ Rev 69(1):84–96, 1979). In the presence of Knightian uncertainty, it is shown that the optimal insurance contract is contingent not only on the realized loss but also on another source of uncertainty coming from the ambiguity.
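For orientation, Arrow's classical result (no ambiguity, premium loaded proportionally to the expected indemnity) makes a straight deductible the optimal indemnity schedule,

$$I^*(x) = (x - d)^+ = \max\{x - d,\ 0\},$$

so the question above is how this loss-contingent schedule changes once both parties face Knightian uncertainty.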

16.
This paper proposes a new two-step stochastic frontier approach to estimate technical efficiency (TE) scores for firms in different groups adopting distinct technologies. As in Battese et al. (J Prod Anal 21:91–103, 2004), the metafrontier production function allows for calculating comparable TE measures, which can be decomposed into group-specific TE measures and technology gap ratios. The proposed approach differs from Battese et al. (J Prod Anal 21:91–103, 2004) and O’Donnell et al. (Empir Econ 34:231–255, 2008) mainly in the second step, where a stochastic frontier analysis model is formulated and applied to obtain the estimates of the metafrontier, instead of relying on programming techniques. The resulting estimators have desirable statistical properties and permit statistical inference. While the within-group variation in firms’ technical efficiencies is frequently assumed to be associated with firm-specific exogenous variables, the between-group variation in technology gaps can be specified as a function of some exogenous variables to take account of group-specific environmental differences. Two empirical applications are illustrated, and the results appear to support the use of our model.
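The decomposition referred to above has the standard multiplicative form: efficiency measured against the metafrontier equals efficiency against the group-$k$ frontier times the technology gap ratio,

$$TE^{*}_{i} = TE^{k}_{i} \times TGR^{k}_{i}, \qquad 0 < TGR^{k}_{i} \le 1,$$

so a firm can be efficient within its own group yet remain far from the metafrontier when its group's technology lags.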

17.
Hitchcock (Synthese 97:335–364, 1993) argues that the ternary probabilistic theory of causality meets two problems due to the problem of disjunctive factors, while arguing that the unanimity probabilistic theory of causality, which is founded on the binary contrast, does not meet them. Hitchcock also argues that only the ternary theory conveys information about complex relations of causal relevance. In this paper, I show that Eells’ solution (Probabilistic Causality, Cambridge University Press, Cambridge, 1991), which is founded on the unanimity theory, meets the two problems. I also show that the unanimity theory too reveals complex relations of causal relevance. I conclude that the two probabilistic theories of causality carve up the same causal structure in two formally different and conceptually consistent ways. Hitchcock’s ternary theory has inspired several major philosophers (Maslen, in Causation and Counterfactuals, pp. 341–357, MIT Press, Cambridge, 2004; Schaffer, Philos Rev 114:297–328, 2005; Northcott, Philos Stud 139:111–123, 2007; Hausman, in The Place of Probability in Science: In Honor of Ellery Eells (1953–2006), pp. 47–64, Springer, Dordrecht, 2010) who have recently developed the ternary theory or the quaternary theory. This paper leads them to reconsider the relation between the ternary theory and the binary theory.
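For reference, the unanimity (binary-contrast) condition in Eells' theory requires probability-raising across every causal background context $K_j$,

$$P(E \mid C \wedge K_j) > P(E \mid \neg C \wedge K_j) \quad \text{for every } j,$$

whereas the ternary theory assesses the causal relevance of $C$ for $E$ relative to a specific alternative $C'$ rather than relative to $\neg C$.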

18.
We analyze the implications of Nash’s (Econometrica 18:155–162, 1950) axioms in ordinal bargaining environments; there, the scale invariance axiom needs to be strengthened to take into account all order-preserving transformations of the agents’ utilities. This axiom, called ordinal invariance, is a very demanding one. For two agents, it is violated by every strongly individually rational bargaining rule. In general, no ordinally invariant bargaining rule satisfies the other three axioms of Nash. Parallel to Roth (J Econ Theory 16:247–251, 1977), we introduce a weaker independence of irrelevant alternatives (IIA) axiom that we argue is better suited for ordinally invariant bargaining rules. We show that the three-agent Shapley–Shubik bargaining rule uniquely satisfies ordinal invariance, Pareto optimality, symmetry, and this weaker IIA axiom. We also analyze the implications of other independence axioms.
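For reference, in cardinal environments Nash's axioms single out the rule that maximizes the product of utility gains over the disagreement point $d$,

$$N(S, d) = \arg\max_{x \in S,\ x \ge d} \ \prod_{i} (x_i - d_i),$$

and it is the scale invariance axiom in this characterization that must be strengthened to ordinal invariance in the ordinal setting studied here.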

19.
Sangun Park, Metrika 77(5):609–616, 2014
The representation of the entropy in terms of the hazard function and its extensions have been studied by many authors, including Teitler et al. (IEEE Trans Reliab 35:391–395, 1986). In this paper, we consider a representation of the Kullback–Leibler information of the first \(r\) order statistics in terms of the relative risk (Park and Shin in Statistics, 2012), the ratio of hazard functions, and extend it to progressively Type II censored data. Then we study the change in the Kullback–Leibler information of the first \(r\) order statistics as a function of \(r\) and discuss its relation with the Fisher information in order statistics.
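The hazard-based representation in its basic form follows from \(f = h \cdot S\) together with the fact that \(S(X)\) is uniform on \((0,1)\): for a nonnegative variable \(X\) with density \(f\), survival function \(S\), and hazard function \(h = f/S\),

$$H(f) = -\,\mathbb{E}[\log f(X)] = 1 - \mathbb{E}[\log h(X)],$$

since \(-\mathbb{E}[\log S(X)] = 1\).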

20.
The aim of the present study is to define quality entropy and to illustrate some of its properties. The simulation of the mathematical model for quality entropy is performed by means of specialized software for mathematical problem simulation, such as Microsoft Excel, which we have employed for this particular study. Our aim is to prove that quality entropy may be extended to the notions of a Markov source of quality and a Bernoulli source of quality, by analogy with the Markov and Bernoulli sources employed in information theory. Likewise, the present study delineates some aspects regarding tolerance to quality entropy. The subject of entropy and its application to the management of quality has been approached by other authors as well (Dinu and Vodă, Revista Calitatea-acces la succes, anul 8(4):60–61, 2007; Dinu, Revista Calitatea-acces la succes, anul 8(5):62–63, 2007; Georgescu-Roegen, Legea Entropiei și Procesul Economic, 1979; Stamatiu, Proceedings of the 7th International Conference on Quality, Reliability and Maintainability, 2000; Stamatiu, Proceedings of the 18th International Conference on Quality, Reliability and Maintainability, 2002). Through our transdisciplinary approach, we would like to contribute to the development of this subject.
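The information-theoretic notions the analogy draws on are standard: for a memoryless (Bernoulli-type) source emitting symbols with probabilities $p_i$, the entropy is

$$H = -\sum_{i} p_i \log_2 p_i,$$

while a Markov source conditions each symbol's distribution on the current state, giving an entropy rate averaged over the stationary distribution.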
