20 similar documents found; search took 827 ms
1.
An important issue when conducting stochastic frontier analysis is how to choose a proper parametric model, which includes
choices of the functional form of the frontier function, distributions of the composite errors, and also the exogenous variables.
In this paper, we extend the likelihood ratio test of Vuong (Econometrica 57(2):307–333, 1989) and the model selection criterion of Takeuchi (Suri-Kagaku (Math Sci) 153:12–18, 1976) to stochastic frontier models. The most attractive feature of this test is that it can be used not only for testing non-nested models but also remains applicable even when the general model is misspecified. Finally,
we also demonstrate how to apply this test to the Indian farm data used by Battese and Coelli (J Prod Anal 3:153–169, 1992; Empir Econ 20(2):325–332, 1995) and Alvarez et al. (J Prod Anal 25:201–212, 2006).
2.
S. C. Richardson K. Politikou M. Terzidou Z. Maka A. Kokkevi 《Quality and Quantity》2006,40(1):121-127
This study examines issues of data quality in a survey conducted in a nationwide probability sample of 8985 students in the
last four years of high school education in Greece. Respondents completed an extensive questionnaire whose main topic of investigation
was the use of licit and illicit substances (tobacco, alcohol, cannabis, etc.). We examined the effect of sex, age and school
performance on data quality. Specifically, we related these factors to the probabilities of correctly observing filters in
the questionnaire, of certain inconsistencies in responses, of reporting difficulty in understanding the questions and being
able to answer honestly. It was found that all these factors have strong effects. Boys’ responses presented more problems
than girls’ (median odds ratio in a series of separate logistic regressions = 1.35) and younger respondents’ more than older
(median odds ratio 1.34 for ages 13–14 and 1.06 for ages 15–16, compared to ages 17–18). The strongest effect was related to school performance: compared to the best students (school marks 18–20), median odds ratios were 1.31 for marks 15–17, 1.76 for marks 12–14 and 3.25 for marks 10–11. Implications for questionnaire design are discussed.
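The odds-ratio comparisons reported above can be illustrated with a minimal sketch (the counts below are hypothetical, not the study's data):

```python
def odds_ratio(a, b, c, d):
    """Odds ratio from a 2x2 table: a/b are cases/non-cases in group 1,
    c/d are cases/non-cases in group 2."""
    return (a * d) / (b * c)

# Hypothetical counts: boys vs girls failing a questionnaire filter check
boys_fail, boys_ok = 270, 1000
girls_fail, girls_ok = 200, 1000
print(round(odds_ratio(boys_fail, boys_ok, girls_fail, girls_ok), 2))  # 1.35
```

An odds ratio of 1.35 means the odds of a data-quality problem are 35% higher in the first group, matching the magnitude of effects reported in the abstract.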
3.
In the present investigation, a new forced quantitative randomized response (FQRR) model has been proposed. Both situations
when the values of the forced quantitative response are known and unknown are studied. The forced qualitative randomized response
models due to Liu and Chow (J Am Stat Assoc 71:72–73, 1976a, Biometrics 32:607–618, 1976b) and Stem and Steinhorst (J Am Stat
Assoc 79:555–564, 1984) are shown as a special case of the situation when the value of the forced quantitative randomized
response is simply replaced by a forced “yes” response. The proposed FQRR model remains more efficient than the recent Bar-Lev et al. (Metrika 60:255–260, 2004) model, hereafter the BBB model. The relative efficiency of the proposed FQRR model with respect to existing competitors, such as the BBB model, has been investigated under different situations. The present model should lead to several new developments in the field of randomized response sampling and encourage researchers to think further along these lines.
4.
Ruey-Ching Hwang Jhao-Siang Siao Huimin Chung C. K. Chu 《Journal of Productivity Analysis》2011,36(3):263-273
We use a stochastic frontier model with firm-specific technical inefficiency effects in a panel framework (Battese and Coelli
in Empir Econ 20:325–332, 1995) to assess two popular probability of bankruptcy (PB) measures based on the Merton model (Merton in J Financ 29:449–470, 1974) and the discrete-time hazard model (DHM; Shumway in J Bus 74:101–124, 2001). Three important results are obtained from our empirical studies. First, a firm with a higher PB generally has less
technical efficiency. Second, for an ex-post bankrupt firm, its PB tends to increase and its technical efficiency of production
tends to decrease, as the time to its bankruptcy draws near. Finally, the information content about firm’s technical inefficiency
provided by PB based on DHM is significantly greater than that based on the Merton model. By the last result and the fact that economic-based efficiency measures are reasonable indicators of the long-term health and prospects of firms (Baek and Pagán in Q J Bus Econ 41:27–41, 2002), we conclude that PB based on DHM is a better credit risk proxy for firms.
5.
Vitaly S. Guzhva Kseniya Beltsova Vladimir V. Golubev 《Journal of Economics and Finance》2010,34(1):30-45
We assess market valuation of airline convertible preferred stocks using a contingent claims valuation model that was extensively
tested by Ramanlal et al. (Rev Quant Financ Account 10:303–319, 1998). Our sample consists of 4,096 daily price observations of 11 convertible preferred stocks issued by the U.S. airlines in
1980–1991. For each convertible we estimate daily model prices for 2 years after issuance and compare them with market prices
by calculating pricing errors. While the entire sample’s mean pricing error is found to be negative 3.8%, the panel data analysis
and the mean pricing errors of the sub-samples indicate that the undervaluation is much more severe in the first 6 months
of trading. The results suggest that airlines leave about 10% on the table when they raise capital by issuing convertible
securities.
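The pricing-error computation described above reduces to the mean relative deviation of market from model prices; a minimal sketch with illustrative numbers (not the paper's data):

```python
def mean_pricing_error(model_prices, market_prices):
    """Mean relative pricing error; negative values mean the market
    prices the security below the model (undervaluation)."""
    errors = [(mkt - mod) / mod
              for mod, mkt in zip(model_prices, market_prices)]
    return sum(errors) / len(errors)

# Two illustrative observations, both trading a few percent under the model
print(mean_pricing_error([100.0, 50.0], [96.0, 48.2]))  # about -0.038
```

A mean error of -3.8% corresponds to the sample-wide undervaluation reported in the abstract.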
6.
Massimo Costabile 《Decisions in Economics and Finance》2002,25(2):111-125
This paper provides a discrete time algorithm, in the framework of the Cox–Ross–Rubinstein analysis (1979), to evaluate both
Parisian options with a flat barrier and Parisian options with an exponential boundary. The algorithm is based on a combinatorial
tool for counting the number of paths of a particle performing a random walk, that remains beyond a barrier constantly for
a period strictly smaller than a pre-specified time interval. As a result, a binomial evaluation model is derived that is
very easy to implement and that produces highly accurate prices.
Received: 19 March 2001 / Accepted: 17 March 2002
The author thanks Prof. Ivar Massabó for helpful comments and discussions. This research has been partially supported by
MIUR (research on “Modelli per la Finanza Matematica”).
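The Cox–Ross–Rubinstein lattice on which the algorithm is built can be sketched as follows (a plain European call for brevity; the paper's Parisian path-counting layer is omitted):

```python
import math

def crr_european_call(s0, strike, r, sigma, t, steps):
    """Cox-Ross-Rubinstein binomial price of a European call option."""
    dt = t / steps
    up = math.exp(sigma * math.sqrt(dt))
    down = 1.0 / up
    p = (math.exp(r * dt) - down) / (up - down)   # risk-neutral up probability
    disc = math.exp(-r * dt)
    # option values at maturity, indexed by the number of up-moves
    values = [max(s0 * up**j * down**(steps - j) - strike, 0.0)
              for j in range(steps + 1)]
    # roll back through the lattice by risk-neutral discounting
    for _ in range(steps):
        values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]

# At-the-money call: should sit near the Black-Scholes value (~10.45)
price = crr_european_call(100.0, 100.0, 0.05, 0.2, 1.0, 500)
```

The Parisian extension replaces the plain backward induction with a count of paths that stay beyond the barrier for less than the prescribed window, which is the combinatorial tool the abstract refers to.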
7.
Fershtman and Nitzan (Eur. Econ. Rev. 35:1057–1067, 1991) presented a continuous dynamic public good game and solved the model
for feedback Nash equilibria. Wirl (Eur. J. Polit. Econ. 12:555–560, 1996) extended the model and considered nonlinear strategies.
Neither model includes uncertainty, and both hence neglect an important factor in the theory of public goods. We extend the framework of Fershtman and Nitzan and include a diffusion term. We consider two cases. In the first case, the volatility of
the diffusion term is dependent on the current level of the public good. This set-up will in principle lead to the same type
of feedback strategies computed under certainty. In the second case, the volatility is dependent on the current rate of public
good provision by the agents. The results are qualitatively different. We provide a detailed discussion as well as numerical
examples. In particular, we show that in both cases uncertainty magnifies the free rider effect.
8.
We show that the recently developed non-parametric procedure for fitting the term structure of interest rates developed by
Linton, Mammen, Nielsen, and Tanggaard (J Econ 105(1):185–223, 2001) overall performs notably better than the highly flexible
McCulloch (J Financ 30:811–830, 1975) cubic spline and Fama and Bliss (Am Econ Rev 77:680–692, 1987) bootstrap methods. However,
if interest is limited to the Treasury-bill region alone then the Fama–Bliss method demonstrates superior performance. We
further show, via simulation, that using the estimated short rate from the Linton–Mammen–Nielsen–Tanggaard procedure as a
proxy for the short rate has higher precision than the commonly used proxies of the one- and three-month Treasury-bill rates. It is demonstrated that this precision is important when using proxies to estimate the stochastic process governing the evolution of the short rate.
9.
Bayo H. Lawal 《Quality and Quantity》2008,42(5):605-612
In this paper, we implement the conditional difference asymmetry model (CDAS) for square tables with nominal categories proposed
by Tomizawa et al. (J. Appl. Stat. 31(3): 271–277, 2004) with the use of the non-standard log-linear model formulation approach.
The implementation is carried out by refitting the model in the 3 × 3 table in (Tomizawa et al. J. Appl. Stat. 31(3): 271–277,
2004). We extend this approach to a larger 4 × 4 table of religious affiliation. We further calculate the measure of asymmetry along with its asymptotic standard error and confidence bounds. The procedure is implemented with SAS PROC GENMOD but can also be implemented in SPSS by following the discussion in Lawal (J. Appl. Stat. 31(3):279–303, 2004; Qual. Quant. 38(3):259–289, 2004).
10.
Amitava Saha 《Metrika》2011,73(2):139-149
Eichhorn and Hayre (J Stat Plan Inference 7:307–316, 1983) introduced the scrambled response technique to gather information
on sensitive quantitative variables. Singh and Joarder (Metron 15:151–157, 1997), Gupta et al. (J Stat Plan Inference 100:239–247,
2002) and Bar-Lev et al. (Metrika 60:255–260, 2004) permitted the respondents either to report their true values on the sensitive
quantitative variable or the scrambled response and developed the optional randomized response (ORR) technique based on simple
random sample with replacement (SRSWR). While developing the ORR procedure, these authors made the assumption that the probability
of disclosing the true response or the randomized response (RR) is the same for all the individuals in a population. This
is not a very realistic assumption as in practical survey situations the probability of reporting the true value or the RR
generally varies from unit to unit. Moreover, if one generalizes the ORR method as developed by these authors relaxing the
‘constant probability’ assumption, the variance of an unbiased estimator for the population total or mean cannot be estimated
as this involves the unknown parameter, ‘the probability of revealing the true response’. Here we propose a modified ORR procedure
for stratified unequal probability sampling after relaxing the assumption of ‘constant probability’ of providing the true
response. It is also demonstrated with a numerical exercise that our procedure produces a better estimator for the population total than the method suggested by the earlier authors.
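The basic multiplicative scrambling idea underlying these ORR variants can be sketched as follows (a simplified Eichhorn–Hayre-style estimator under SRSWR with simulated data; not the stratified unequal-probability procedure proposed here):

```python
import random

def scrambled_mean_estimate(reports, scramble_mean):
    """Unbiased mean estimator when each respondent reports y * s,
    with s drawn from a known scrambling distribution whose mean is known."""
    return sum(reports) / len(reports) / scramble_mean

rng = random.Random(0)
true_values = [rng.uniform(0, 100) for _ in range(50_000)]  # sensitive variable
reports = [y * rng.uniform(1, 2) for y in true_values]      # scrambled responses
estimate = scrambled_mean_estimate(reports, scramble_mean=1.5)
```

Since no respondent reveals a raw value, privacy is protected, yet dividing the mean report by E[s] recovers the population mean without bias.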
11.
We examine the asymptotic behavior of two strategyproof mechanisms discussed by Moulin for public goods – the conservative equal costs rule (CER) and the serial cost sharing rule (SCSR) – and compare their performance to that of the pivotal mechanism (PM) from the Clarke–Groves family. Allowing the individuals’ valuations for an excludable public project to be random variables, we show under very general assumptions that expected welfare loss generated by the CER, as the size of the population increases, becomes arbitrarily large. However, all moments of the SCSR’s random welfare loss asymptotically converge to zero. The PM does better than the SCSR, with its welfare loss converging even more rapidly to zero.
12.
Shantanu Bagchi 《Journal of Economics and Finance》2011,35(1):41-70
The standard neoclassical life-cycle model predicts that individual consumption should either increase, remain constant or
fall monotonically depending on whether the market rate of return on savings is greater than, equal to or less than the discount
rate. However, empirical evidence suggests that even after controlling for economic growth and family size, household consumption
exhibits a robust hump at around age 45–55, with the ratio of peak consumption to consumption when entering the workforce
greater than 1.1. This paper extends the “overconfidence” explanation (Caliendo and Huang, J. Macroecon 30(4):1347–1369, 2008) of this macroeconomic puzzle to a calibrated general equilibrium environment. The main finding is that although it is possible
to identify parameter values under which overconfidence alone generates life-cycle consumption profiles and macro-indicators
consistent with U.S. experience, quite extreme assumptions about both the magnitude and distribution of overconfidence in
the population are generally required to obtain them.
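The monotonicity prediction cited above follows from the standard CRRA Euler equation (a textbook sketch; the notation is generic, not taken from the paper):

```latex
% Consumption growth under CRRA utility with discount rate \rho,
% market return r, and relative risk aversion \gamma:
\frac{c_{t+1}}{c_t} = \left( \frac{1+r}{1+\rho} \right)^{1/\gamma}
% Consumption rises iff r > \rho, is flat iff r = \rho, and falls
% iff r < \rho -- in every case monotone, which a mid-life hump violates.
```

This is why the observed hump at ages 45–55 constitutes a puzzle for the baseline model.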
13.
Faraz and Parsian (Statistical Paper, 47: 569–593, 2006) have shown that the double warning lines (DWL) scheme detects process
shifts more quickly than the other variable ratio sampling schemes such as variable sample sizes (VSS), variable sampling
intervals (VSI) and variable sample sizes and sampling intervals (VSSVSI). In this paper, the DWLT2control chart for monitoring the process mean vector is economically designed. The cost model proposed by Costa and Rahim
(Journal of Applied Statistics, 28: 875–885, 2001) is used here and is minimized through a genetic algorithm (GA) approach.
Then the effects of the model parameters on the chart parameters and resulting operating loss is studied and finally a comparison
between all possible variable ratio sampling (VRS) schemes are made to choose the best option economically. 相似文献
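The GA minimization step can be sketched generically as follows (a toy real-coded GA with a stand-in quadratic cost surface; the actual Costa–Rahim cost model and chart parameters are not reproduced here):

```python
import random

def genetic_minimize(cost, bounds, pop_size=40, generations=200, seed=0):
    """Minimal real-coded GA: tournament selection, blend crossover,
    Gaussian mutation clipped to the search bounds."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    best = min(pop, key=cost)
    for _ in range(generations):
        nxt = []
        for _ in range(pop_size):
            a = min(rng.sample(pop, 3), key=cost)        # tournament selection
            b = min(rng.sample(pop, 3), key=cost)
            child = [(x + y) / 2 for x, y in zip(a, b)]  # blend crossover
            child = [min(max(g + rng.gauss(0, 0.1 * (hi - lo)), lo), hi)
                     for g, (lo, hi) in zip(child, bounds)]  # mutation
            nxt.append(child)
        pop = nxt
        gen_best = min(pop, key=cost)
        if cost(gen_best) < cost(best):
            best = gen_best
    return best

# Stand-in quadratic cost surface with its minimum at (3, 1)
best = genetic_minimize(lambda x: (x[0] - 3) ** 2 + (x[1] - 1) ** 2,
                        [(0.0, 10.0), (0.0, 10.0)])
```

In the economic-design setting, the decision vector would hold the chart parameters (sample sizes, warning limits, sampling intervals) and the cost function would be the Costa–Rahim hourly loss.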
14.
This paper generalizes Kunert and Martin’s (Ann Stat 28:1728–1742, 2000) method for finding optimal designs under a fixed
interference model, to find optimal designs under a mixed interference model. The results are based on the properties of information
matrices in fixed and mixed models given in Markiewicz (J Stat Plan Inference 59:127–137, 1997). The method is applied to
find a design which is optimal for any given variances of random neighbor effects.
Research partially supported by the KBN Grant Number 5 P03A 041 21.
15.
Andy Neely 《Operations Management Research》2008,1(2):103-118
Commentators suggest that to survive in developed economies manufacturing firms have to move up the value chain, innovating
and creating ever more sophisticated products and services, so they do not have to compete on the basis of cost. While this
strategy is proving increasingly popular with policy makers and academics, there is limited empirical evidence exploring the extent to which it is being adopted in practice and, if so, what the impact of this servitization of manufacturing might be.
This paper seeks to fill a gap in the literature by presenting empirical evidence on the range and extent of servitization.
Data are drawn from the OSIRIS database on 10,028 firms incorporated in 25 different countries. The paper presents an analysis
of these data which suggests that: [i] manufacturing firms in developed economies are adopting a range of servitization strategies—12
separate approaches to servitization are identified; [ii] these 12 categories can be used to extend the traditional three
options for servitization—product oriented Product–Service Systems, use oriented Product–Service Systems and result oriented
Product–Service Systems, by adding two new categories “integration oriented Product–Service Systems” and “service oriented
Product–Service Systems”; [iii] while the manufacturing firms that have servitized are larger than traditional manufacturing
firms in terms of sales revenues, at the aggregate level they also generate lower profits as a percentage of sales; [iv] these findings
are moderated by firm size (measured in terms of numbers of employees). In smaller firms servitization appears to pay off
while in larger firms it proves more problematic; and [v] there are some hidden risks associated with servitization—the sample
contains a greater proportion of bankrupt servitized firms than would be expected.
16.
Vathana Ly Vath 《Decisions in Economics and Finance》2007,30(2):79-94
This paper studies the existence of a competitive market equilibrium under asymmetric information. There are two agents involved
in the trading of the risky assets: an “informed” trader and an “ordinary” trader. The market is competitive and the ordinary
agent can infer the insider information from the price dynamics of the risky assets. The insider information is considered
to be the total supply of the risky assets. The definition of market equilibrium is based on the law of supply-demand as described
by a rational expectations equilibrium of the Grossman and Stiglitz (Am Econ Rev 70:393–408, 1980) model. We show that equilibrium
can be attained by linear dynamics of an admissible price process of the risky assets for a given linear supply dynamics.
17.
Faraz Alireza Kazemzadeh R. B. Moghadam M. B. Parsian Ahmad 《Quality and Quantity》2012,46(4):1323-1336
Faraz and Parsian (Stat Pap 47:569–593, 2006) have shown that the double warning lines (DWL) scheme detects process shifts more quickly than the other variable ratio
sampling schemes such as variable sample sizes (VSS), variable sampling intervals (VSI) and variable sample sizes and sampling
intervals (VSSVSI). In this paper, the DWL T2 control chart for monitoring the process mean vector is economically designed. The cost model proposed by Costa and Rahim
(J Appl Stat 28:875–885, 2001) is used here and is minimized through a genetic algorithm (GA) approach. Then the effects of the model parameters on the chart parameters and the resulting operating loss are studied, and finally a comparison between all possible variable ratio sampling (VRS) schemes is made to choose the best option economically.
18.
In this paper, we take up the approach of Lindberg (Bernoulli 15(2):464–474, 2009), who introduced a new parameterization of the Black–Scholes model that allows for an easy solution of the continuous-time Markowitz mean-variance problem. We generalize the results of Lindberg (2009) to a jump-diffusion market setting and slightly correct the proof and the assertion of the main result. Further, we demonstrate the implications of the Lindberg parameterization for the stock price drift vector in different market settings, analyse the dependence of the optimal portfolio on jump and diffusion risk, and finally indicate how to use the method. In particular, we also show how the optimal strategy can be obtained with restricted use of historical data.
19.
The article examines whether US threat perceptions, defined in terms of federal government national defense outlays in billions of constant (FY 2000) dollars, change along with periodic changes in international politics between 1945 and 2007.
Three different models affecting the direction of US defense expenditures are developed. The first model is estimated using five link functions, although results for only two of them, complementary log–log and cauchit, are presented. As complementary log–log produced the best results, the other models are estimated using only this function. The parameter estimates of the complementary
log–log function for the first model indicate that four of these variables (Ford, Carter, Reagan and Bush Sr.) out of eleven
are significant in the category of presidents. “Truman Doctrine/Cominform”, “Korean War”, “Vietnam War”, and “Invasion of
Iraq” also seem to be the important independent variables on empirical grounds for the first model. While “Party”, “Invasion
of Iraq”, “Vietnam War”, “Korean War”, and “Cuban Missile Crisis” constitute the important independent variables on empirical
grounds for the second model, “Korean War”, “Vietnam War”, “Invasion of Iraq”, “Truman Doctrine/Cominform”, “The Cold War
and New World Order”, and “Cuban Missile Crisis” are important independent variables on empirical grounds for the third model.
Estimations based on these three models therefore suggest that the aforementioned independent variables do indeed have an effect on US defense expenditures.
20.
C. Di Guilmi F. Clementi T. Di Matteo M. Gallegati 《Journal of Economic Interaction and Coordination》2008,3(1):43-57
This paper uses firm-level data recorded in the Amadeus database to investigate the distribution of labour productivity in different European countries. We find that the upper tail
of the empirical productivity distributions follows a decaying power-law, whose exponent α is obtained by a semi-parametric estimation technique recently developed by Clementi et al. [Physica A 370(1):49–53, 2006].
The emergence of “fat tails” in productivity distribution has already been detected in Di Matteo et al. [Eur Phys J B 47(3):459–466,
2005] and explained by means of a social network model. Here we test this model on a broader sample of countries having different patterns of social network structure. These different social attitudes, measured using a social capital indicator, are reflected in the power-law exponent estimates, thereby verifying the existence of linkages between firms’ productivity performance and social networks.
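Tail-exponent estimation of the kind described can be illustrated with the classical Hill estimator (a standard alternative, not the Clementi et al. semi-parametric technique used in the paper), applied to a synthetic Pareto sample:

```python
import math
import random

def hill_estimator(data, k):
    """Hill estimator of the tail exponent alpha from the k largest observations."""
    tail = sorted(data, reverse=True)[:k + 1]
    return k / sum(math.log(x / tail[k]) for x in tail[:k])

# Synthetic Pareto(alpha = 2) sample: the estimate should sit near 2
rng = random.Random(1)
sample = [rng.paretovariate(2.0) for _ in range(20_000)]
alpha_hat = hill_estimator(sample, k=1_000)
```

A smaller estimated exponent means a fatter upper tail, i.e. a larger share of very high-productivity firms, which is the cross-country variation the paper links to social capital.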