Similar Documents
20 similar documents found (search time: 31 ms)
1.
Studies of childhood development have suggested human capital is accumulated in complex and nonlinear ways. Nonetheless, empirical analyses of this process often impose a linear functional form. This paper investigates which technology assumptions matter in quantitative models of human capital production. I propose a general-to-restricted procedure to test the production technology, placing constraints on a modified McCarthy function, from which transcendental, constant elasticity of substitution, log-linear and linear models are obtained as special cases. Applying the procedure to data on child height from the Young Lives surveys, as well as cognitive skills, I find that the technology of human capital production is neither log-linear nor linear-in-parameters; rather, past and present inputs act as complements. I recommend that maintained hypotheses underlying functional form choices should be tested on a routine basis.
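To see the kind of nesting the general-to-restricted procedure exploits, consider the constant elasticity of substitution (CES) special case; the notation here is illustrative, not the paper's:

$$h_t = A\left[\gamma\, x_t^{\rho} + (1-\gamma)\, h_{t-1}^{\rho}\right]^{1/\rho}.$$

Letting $\rho \to 0$ yields the log-linear (Cobb-Douglas) form $\ln h_t = \ln A + \gamma \ln x_t + (1-\gamma)\ln h_{t-1}$, while $\rho = 1$ gives the linear model; testing such parameter restrictions against the more general function is what distinguishes complementarity of past and present inputs from the log-linear benchmark.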

2.
Professional service firms (PSFs) play an important role in the knowledge-based economy. Their success is highly dependent on their people, the knowledge resources they possess, and how they use these resources. However, how to systematically manage human resources to attain high performance is not fully understood. This study addresses this issue by investigating the linkage mechanisms through which high-performance work systems (HPWS) influence the performance of PSFs. We integrate resource-based and dynamic capability theories in order to identify and investigate two intervening mechanisms that link HR practices to firm performance. The first mechanism is the intellectual capital resources comprising the human, social, and organizational capital that HPWS create. The second mechanism is the uses to which both HPWS and resources can be applied, operationalized as organizational ambidexterity, the simultaneous exploitation of existing knowledge and exploration of new knowledge. These mechanisms are hypothesized to link HPWS to firm performance in the form of a practices-resources-uses-performance linkage model. Results from a longitudinal study of 93 accounting firms support this linkage model. © 2015 Wiley Periodicals, Inc.

3.
Rationalizing non-participation as a resource deficiency in the household, this paper identifies strategies for milk-market development in the Ethiopian highlands. The additional amounts of covariates required for positive marketable surplus—'distances-to market'—are computed from a model in which production and sales are correlated; sales are left-censored at some unobserved threshold; production efficiencies are heterogeneous; and the data are in the form of a panel. Incorporating these features into the modeling exercise is important because they are fundamental to the data-generating environment. There are four reasons. First, because production and sales decisions are enacted within the same household, both decisions are affected by the same exogenous shocks, and production and sales are therefore likely to be correlated. Second, because selling involves time and time is arguably the most important resource available to a subsistence household, the minimum sales amount is not zero but, rather, some unobserved threshold that lies beyond zero. Third, the potential existence of heterogeneous abilities in management, ones that lie latent from the econometrician's perspective, suggest that production efficiencies should be permitted to vary across households. Fourth, we observe a single set of households during multiple visits in a single production year. The results convey clearly that institutional and production innovations alone are insufficient to encourage participation. Market-precipitating innovation requires complementary inputs, especially improvements in human capital and reductions in risk. Copyright © 2008 John Wiley & Sons, Ltd.
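A stylized sketch of the censoring structure described above (notation assumed for illustration): observed sales $s_{it}$ equal latent desired sales $s^{*}_{it}$ only when the latter clears an unobserved threshold $\tau > 0$ reflecting the time cost of selling,

$$s_{it} = \begin{cases} s^{*}_{it}, & s^{*}_{it} > \tau, \\ 0, & \text{otherwise,} \end{cases}$$

with the errors of the production and sales equations allowed to be correlated and household-specific efficiency terms entering the production side.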

4.
This paper proposes a cointegration approach to testing the validity of long-run equilibrium in production, where capital and labour are taken as quasi-fixed inputs. Previous studies consider only capital as the quasi-fixed input and do not take account of the time series properties of the variables, assuming implicitly that they are stationary. The canonical cointegrating regressions (CCR) procedure is employed to test for cointegration in both the single-equation and the seemingly unrelated regressions framework, and long-run equilibrium conditions are tested. The evidence from US manufacturing reveals that capital and labour are not fully adjusted to their long-run optimal values, casting doubt on the long-run equilibrium hypothesis. Copyright © 2001 John Wiley & Sons, Ltd.
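CCR itself is not implemented in common Python libraries, but the flavour of a residual-based cointegration test between (log) output and a quasi-fixed input can be sketched with statsmodels' Engle-Granger test; this is a simplified stand-in for illustration, not the paper's CCR procedure, and the series are simulated:

```python
import numpy as np
from statsmodels.tsa.stattools import coint

# y: log output, k: log capital stock -- illustrative simulated series
rng = np.random.default_rng(42)
k = np.cumsum(rng.normal(size=200))              # I(1) quasi-fixed input
y = 0.6 * k + rng.normal(scale=0.5, size=200)    # cointegrated with k

t_stat, p_value, crit = coint(y, k, trend="c")
print(f"EG t-stat = {t_stat:.2f}, p = {p_value:.3f}")
# a small p-value rejects 'no cointegration', consistent with a
# long-run equilibrium relationship between output and the input
```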

5.
I find that when a reseller with market power serves an airline company and only linear contracts are feasible, the airline prefers that the reseller utilizes the Name-Your-Own-Price (NYOP) (a la Priceline) instead of the Posted Price (PP) (a la Hotwire) model. Essentially, the airline can better extract the surplus of the reseller if power over pricing is in the hands of numerous consumers, each bidding according to her preferences, instead of being concentrated in the hands of the reseller. Introducing two part tariff contracts or competition among resellers eliminates the distinction between the two pricing models. Either form of pricing generates the same outcome as vertical integration of the airline with the downstream market of resellers.

6.
In this paper I make an attempt to describe, discuss and extend a few aspects of the rich mathematical tapestry that can be woven with rigorous notions of non-linear dynamics, complexity and randomness, in terms of algorithmic mathematics. It is a tapestry that I try to weave with economic analysis, economic theory and economic modelling in mind. All three notions – that is, non-linear dynamics, complexity and randomness – have a rich conceptual, modelling or analytic tradition in core areas of economic theory, both at the micro and macro levels. It is the algorithmic foundation I try to provide for them that could be considered the novel contribution in this paper. Once the algorithmic foundations are in place, it is, for example, almost natural to consider the famed difficulties of obtaining closed form solutions for non-linear, complex or random dynamic models in economics almost a trivial vestige of a pre-simulation era in mathematical modelling.

7.
In many industries, broad cross-license agreements are considered a useful method to obtain freedom to operate and to avoid patent litigation. In this paper, I study firm incentives to sign a broad cross-license as well as the duration of broad cross-license negotiations. I develop a model of bargaining with learning, which predicts that two firms will enter a broad cross-license agreement only if their capital intensities are large enough. The model also predicts faster negotiations when firms have high capital intensities and when the frequency of future disputes is low. I confirm these predictions empirically using a novel data set on cross-licensing and litigation in the US semiconductor industry.

8.
The aim of this study is to investigate the elements of organizational career management (OCM) that can lead to strong organizational performance. The growing unpredictability of careers requires a different organizational approach of careers. Yet, new career models all focus on the individual as the central actor, leaving the role of the organization rather underdeveloped. Based on a combined perspective integrating insights from the literature on careers, high performance work systems, and idiosyncratic deals (I-deals), we address four dimensions of OCM: supportive and developmental practices, development I-deals, individual responsibility, and consensus. We study their relationships with company performance, thereby including the firm's human capital composition. Surveys were administered to the HR directors of 293 organizations. We apply a relatively new method, fsQCA (fuzzy-set qualitative comparative analysis), and complement this with more conventional structural equation modeling (SEM). The SEM analyses suggest that only supportive and developmental practices are positively associated with high performance. However, based on the fsQCA, three configurations are identified in which OCM is associated with high performance. The most prevalent configuration combined supportive and developmental practices with I-deals and individual responsibility for career management. We conclude with a discussion of the implications of our findings, and address the utility of adopting a configurational approach in career research. © 2016 Wiley Periodicals, Inc.

9.
Recent years have seen an explosion of activity in the field of functional data analysis (FDA), in which curves, spectra, images and so on are considered as basic functional data units. A central problem in FDA is how to fit regression models with scalar responses and functional data points as predictors. We review some of the main approaches to this problem, categorising the basic model types as linear, non-linear and non-parametric. We discuss publicly available software packages and illustrate some of the procedures by application to a functional magnetic resonance imaging data set.
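As a concrete illustration of the linear model type with scalar responses, here is a minimal numpy sketch that regresses a scalar outcome on functional predictors through their leading functional principal component scores; the function and variable names are our own, and this is one standard approach rather than any specific package's implementation:

```python
import numpy as np

def fpca_scalar_on_function(X, y, n_components=4):
    """Scalar-on-function regression via functional principal component scores.

    X : (n, T) array of curves sampled on a common grid
    y : (n,) vector of scalar responses
    """
    Xc = X - X.mean(axis=0)                            # centre the curves
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * s[:n_components]    # FPC scores per curve
    Z = np.column_stack([np.ones(len(y)), scores])     # intercept + scores
    coefs, *_ = np.linalg.lstsq(Z, y, rcond=None)
    beta_fun = Vt[:n_components].T @ coefs[1:]         # implied coefficient curve
    return coefs, beta_fun

# illustrative use with simulated curves
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 50))
y = X[:, :10].mean(axis=1) + 0.1 * rng.normal(size=100)
coefs, beta_fun = fpca_scalar_on_function(X, y)
```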

10.
A semiparametric estimator for binary-outcome sample-selection models is proposed that imposes only single index assumptions on the selection and outcome equations without specifying the error term distribution. I adopt the idea in Lewbel (2000) of using a 'special regressor' to transform the binary response Y so that the transformed Y becomes linear in the latent index, which then makes it possible to remove the selection correction term by differencing the transformed Y equation. There are various versions of the estimator, which perform differently, trading off bias and variance. A simulation study is conducted, and I then apply the estimators to US presidential election data from 2008 and 2012 to assess the impact of racial prejudice on the elections, as a black candidate was involved for the first time in US history.
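To make the special-regressor idea concrete, here is a rough sketch of the transform in the spirit of Lewbel (2000), stated under its usual support and independence conditions; this is illustrative and not the paper's exact estimator. For a binary outcome $Y_i = \mathbf{1}(X_i'\beta + V_i + e_i > 0)$ with special regressor $V_i$, the transformed response

$$T_i = \frac{Y_i - \mathbf{1}(V_i > 0)}{f(V_i \mid X_i)}$$

satisfies $E[T_i \mid X_i] = X_i'\beta$ under the required conditions, so the transformed $Y$ is linear in the latent index and the selection-correction term can then be removed by differencing, as described above.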

11.
This paper examines the process linking high-performance work systems (HPWS) and organisational ambidexterity at both the unit and firm levels of analysis by integrating strategic HRM, human capital and social capital perspectives. Multisource and multilevel data from 2,887 employees and 536 managers of 58 banks were collected. Results revealed that firm-level HPWS were positively related to unit-level employee human capital. Unit-level employee human capital partially mediated the relationship between firm-level HPWS and unit organisational ambidexterity. Furthermore, firm-level social climate moderated the effect of firm-level HPWS on unit organisational ambidexterity through unit-level employee human capital. This paper contributes to HPWS and ambidexterity research by revealing the impacts of firm-level HPWS and mediating mechanisms, as well as identifying boundary conditions for pursuing unit-level organisational ambidexterity.

12.
In this article, we investigate the behaviour of a number of methods for estimating the co-integration rank in VAR systems characterized by heteroskedastic innovation processes. In particular, we compare the efficacy of the most widely used information criteria, such as the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC), with the commonly used sequential approach of Johansen [Likelihood-based Inference in Cointegrated Vector Autoregressive Models (1996)] based around the use of either asymptotic or wild bootstrap-based likelihood ratio type tests. Complementing recent work done for the latter in Cavaliere, Rahbek and Taylor [Econometric Reviews (2014) forthcoming], we establish the asymptotic properties of the procedures based on information criteria in the presence of heteroskedasticity (conditional or unconditional) of a quite general and unknown form. The relative finite-sample properties of the different methods are investigated by means of a Monte Carlo simulation study. For the simulation DGPs considered in the analysis, we find that the BIC-based procedure and the bootstrap sequential test procedure deliver the best overall performance in terms of their frequency of selecting the correct co-integration rank across different values of the co-integration rank, sample size, stationary dynamics and models of heteroskedasticity. Of these, the wild bootstrap procedure is perhaps the more reliable overall, as it avoids a significant tendency seen in the BIC-based method to over-estimate the co-integration rank in relatively small sample sizes.
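For reference, the asymptotic sequential trace-test procedure (without the wild bootstrap refinement studied in the article) can be sketched with statsmodels; the helper below is a minimal illustration on simulated data:

```python
import numpy as np
from statsmodels.tsa.vector_ar.vecm import coint_johansen

def sequential_trace_rank(endog, det_order=0, k_ar_diff=1, cv_col=1):
    """Johansen sequential trace test; cv_col 0/1/2 -> 90/95/99% critical values.

    Uses asymptotic critical values only; the wild-bootstrap variant
    discussed in the article is not implemented here.
    """
    res = coint_johansen(endog, det_order, k_ar_diff)
    for r, (trace, cv) in enumerate(zip(res.lr1, res.cvt[:, cv_col])):
        if trace < cv:                # cannot reject rank <= r: stop
            return r
    return endog.shape[1]             # trace test rejects at every step

# illustrative use: two series share one common trend, a third is independent
rng = np.random.default_rng(1)
common = np.cumsum(rng.normal(size=500))
endog = np.column_stack([common + rng.normal(size=500),
                         common + rng.normal(size=500),
                         np.cumsum(rng.normal(size=500))])
print(sequential_trace_rank(endog))   # expected rank: 1
```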

13.
The human capital of a firm as manifested by employee knowledge and experience represents a key resource of a firm's capabilities. Prior empirical studies have found that firms composed of high levels of human capital experience superior firm performance. Human capital theory proposes that an individual's general or firm-specific human capital is positively related to compensation. However, empirical studies examining firm-specific human capital's association with higher employee compensation have been inconclusive. The current study proposes that firm-specific human capital be categorized as task-specific and non-task-specific. Employees accumulate task-specific human capital through duties conducted in their current position. Non-task-specific human capital represents experiences gained in prior positions to an employee's current job within the firm. Utilizing human capital data from 38,390 employees representing 76 firms in the IT sector, this study examines the association between forms of human capital and employee compensation at different levels of firm productivity. Results show that task-specific human capital is associated with higher employee compensation. In addition, firm productivity moderates this association.

14.
This paper considers estimation and inference in linear panel regression models with lagged dependent variables and/or other weakly exogenous regressors when N (the cross-section dimension) is large relative to T (the time series dimension). It allows for fixed and time effects (FE-TE) and derives a general formula for the bias of the FE-TE estimator which generalizes the well-known Nickell bias formula derived for the pure autoregressive dynamic panel data models. It shows that in the presence of weakly exogenous regressors inference based on the FE-TE estimator will result in size distortions unless N/T is sufficiently small. To deal with the bias and size distortion of the FE-TE estimator the use of a half-panel jackknife FE-TE estimator is considered and its asymptotic distribution is derived. It is shown that the bias of the half-panel jackknife FE-TE estimator is of order T^{-2}, and for valid inference it is only required that N/T^3 → 0 as N, T → ∞ jointly. Extension to unbalanced panel data models is also provided. The theoretical results are illustrated with Monte Carlo evidence. It is shown that the FE-TE estimator can suffer from large size distortions when N > T, with the half-panel jackknife FE-TE estimator showing little size distortion. The use of the half-panel jackknife FE-TE estimator is illustrated with two empirical applications from the literature.
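The half-panel jackknife has a simple mechanical form: estimate on the full panel and on each time-half, then combine as beta_hpj = 2*beta_full - (beta_1 + beta_2)/2. A minimal numpy sketch for a balanced panel with unit fixed effects only (the paper's FE-TE version also sweeps out time effects, which this sketch omits):

```python
import numpy as np

def fe_estimator(y, X):
    """Within (unit fixed-effects) estimator.  y: (N, T); X: (N, T, K)."""
    yw = y - y.mean(axis=1, keepdims=True)          # sweep out unit means
    Xw = X - X.mean(axis=1, keepdims=True)
    b, *_ = np.linalg.lstsq(Xw.reshape(-1, X.shape[2]), yw.ravel(), rcond=None)
    return b

def half_panel_jackknife(y, X):
    """beta_hpj = 2*beta_full - (beta_first_half + beta_second_half) / 2."""
    h = y.shape[1] // 2                             # assumes T is even
    b_full = fe_estimator(y, X)
    b1 = fe_estimator(y[:, :h], X[:, :h])
    b2 = fe_estimator(y[:, h:], X[:, h:])
    return 2.0 * b_full - 0.5 * (b1 + b2)
```

The combination works because the leading O(1/T) bias term of the within estimator appears doubled in the half-panel estimates, so it cancels in the linear combination, leaving a bias of smaller order.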

15.
This paper derives a procedure for simulating continuous non-normal distributions with specified L-moments and L-correlations in the context of power method polynomials of order three. It is demonstrated that the proposed procedure has computational advantages over the traditional product-moment procedure in terms of solving for intermediate correlations. Simulation results also demonstrate that the proposed L-moment-based procedure is an attractive alternative to the traditional procedure when distributions with more severe departures from normality are considered. Specifically, estimates of L-skew and L-kurtosis are superior to the conventional estimates of skew and kurtosis in terms of both relative bias and relative standard error. Further, the L-correlation is also demonstrated to be less biased and more stable than the Pearson correlation. It is also shown how the proposed L-moment-based procedure can be extended to the larger class of power method distributions associated with polynomials of order five.
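A minimal sketch of the two ingredients: a third-order power-method transform of standard normal draws, and sample L-moments computed from probability-weighted moments. The polynomial coefficients below are illustrative, not solved from target L-moments as in the paper's procedure:

```python
import numpy as np
from scipy.special import comb

def sample_lmoments(x):
    """Mean, L-scale, L-skew and L-kurtosis via probability-weighted moments."""
    x = np.sort(x)
    n = len(x)
    i = np.arange(1, n + 1)
    b = [np.mean(comb(i - 1, r) / comb(n - 1, r) * x) for r in range(4)]
    l2 = 2 * b[1] - b[0]
    l3 = 6 * b[2] - 6 * b[1] + b[0]
    l4 = 20 * b[3] - 30 * b[2] + 12 * b[1] - b[0]
    return b[0], l2, l3 / l2, l4 / l2

# third-order power-method transform of standard normal draws
rng = np.random.default_rng(0)
z = rng.standard_normal(100_000)
c0, c1, c2, c3 = 0.0, 0.9, 0.15, 0.03   # illustrative coefficients
y = c0 + c1 * z + c2 * z**2 + c3 * z**3
print(sample_lmoments(y))                # mean, L-scale, L-skew, L-kurtosis
```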

16.
Structural vector autoregressive (SVAR) models have emerged as a dominant research strategy in empirical macroeconomics, but suffer from the large number of parameters employed and the resulting estimation uncertainty associated with their impulse responses. In this paper, we propose general-to-specific (Gets) model selection procedures to overcome these limitations. It is shown that single-equation procedures are generally efficient for the reduction of recursive SVAR models. The small-sample properties of the proposed reduction procedure (as implemented using PcGets) are evaluated in a realistic Monte Carlo experiment. The impulse responses generated by the selected SVAR are found to be more precise and accurate than those of the unrestricted VAR. The proposed reduction strategy is then applied to the US monetary system considered by Christiano, Eichenbaum and Evans (Review of Economics and Statistics, Vol. 78, pp. 16–34, 1996). The results are consistent with the Monte Carlo findings and question the validity of the impulse responses generated by the full system.
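The single-equation reduction idea can be caricatured as backward elimination of insignificant regressors; the toy sketch below repeatedly drops the regressor with the smallest |t|-ratio until all survivors clear a threshold. PcGets' multi-path search and battery of diagnostic tests are far richer than this, so treat it only as an illustration of the general-to-specific direction of travel:

```python
import numpy as np

def gets_backward_eliminate(y, X, names, t_crit=1.96):
    """Toy general-to-specific reduction of one equation by backward elimination."""
    keep = list(range(X.shape[1]))
    while keep:
        Xk = X[:, keep]
        beta, *_ = np.linalg.lstsq(Xk, y, rcond=None)
        resid = y - Xk @ beta
        sigma2 = resid @ resid / (len(y) - len(keep))
        se = np.sqrt(sigma2 * np.diag(np.linalg.inv(Xk.T @ Xk)))
        t = np.abs(beta / se)
        worst = int(np.argmin(t))
        if t[worst] >= t_crit:        # every regressor significant: stop
            break
        keep.pop(worst)                # drop the least significant regressor
    return [names[j] for j in keep]
```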

17.
This paper deals with the issue of testing hypotheses in symmetric and log-symmetric linear regression models in small and moderate-sized samples. We focus on four tests, namely, the Wald, likelihood ratio, score, and gradient tests. These tests rely on asymptotic results and are unreliable when the sample size is not large enough to guarantee a good agreement between the exact distribution of the test statistic and the corresponding chi-squared asymptotic distribution. Bartlett and Bartlett-type corrections typically attenuate the size distortion of the tests. These corrections are available in the literature for the likelihood ratio and score tests in symmetric linear regression models. Here, we derive a Bartlett-type correction for the gradient test. We show that the corrections are also valid for the log-symmetric linear regression models. We numerically compare the various tests and their bootstrapped versions through simulations. Our results suggest that the corrected and bootstrapped tests exhibit type I error probabilities closer to the chosen nominal level with virtually no power loss. The analytically corrected tests, including the Bartlett-corrected gradient test derived in this paper, perform as well as the bootstrapped tests, with the advantage of not requiring computationally intensive calculations. We present a real data application to illustrate the usefulness of the modified tests.
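For orientation, the four statistics in their standard likelihood forms, with $\hat\theta$ the unrestricted and $\tilde\theta$ the restricted estimate, $\ell$ the log-likelihood, $U$ the score and $I$ the information:

$$W = (\hat\theta - \tilde\theta)'\, I(\hat\theta)\,(\hat\theta - \tilde\theta), \qquad LR = 2\{\ell(\hat\theta) - \ell(\tilde\theta)\},$$
$$S = U(\tilde\theta)'\, I(\tilde\theta)^{-1}\, U(\tilde\theta), \qquad T_G = U(\tilde\theta)'(\hat\theta - \tilde\theta).$$

All four are asymptotically $\chi^2$ under the null; Bartlett and Bartlett-type corrections rescale the statistics so that their finite-sample distributions match the $\chi^2$ reference more closely, which is what attenuates the size distortion.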

18.
We use recent statistical tests, based on a 'distance' between the model and the Hansen–Jagannathan bound, to compute the rejection rates of true models. For asset-pricing models with time-separable preferences, the finite-sample distribution of the test statistic associated with the risk-neutral case is extreme, in the sense that critical values based on this distribution deliver type I errors no larger than intended—regardless of risk aversion or the rate of time preference. We also show that these maximal-type-I-error critical values are appropriate for both time and state non-separable preferences and that they yield acceptably small type II error rates. Copyright © 2002 John Wiley & Sons, Ltd.
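For context, the Hansen–Jagannathan bound in its standard excess-return form states that any admissible stochastic discount factor $m$ must satisfy

$$\frac{\sigma(m)}{E(m)} \;\ge\; \sup_{R^{e}} \frac{|E(R^{e})|}{\sigma(R^{e})},$$

i.e. the SDF's volatility relative to its mean must weakly exceed the maximal Sharpe ratio attainable from the asset menu. The distance-based tests referred to above reject a candidate model when its implied SDF lies too far from the set of SDFs consistent with this bound.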

19.
Despite the solid theoretical foundation on which the gravity model of bilateral trade is based, empirical implementation requires several assumptions which do not follow directly from the underlying theory. First, unobserved trade costs are assumed to be a (log-)linear function of observables. Second, the effects of trade costs on trade flows are assumed to be constant across country pairs. Maintaining consistency with the underlying theory, but relaxing these assumptions, we estimate gravity models—in levels and logs—using two data sets via nonparametric methods. The results are striking. Despite the added flexibility of the nonparametric models, parametric models based on these assumptions offer equally or more reliable in-sample predictions and out-of-sample forecasts in the majority of cases, particularly in the levels model. Moreover, formal statistical tests fail to reject either parametric functional form. Thus, concerns in the gravity literature over functional form appear unwarranted, and estimation of the gravity model in levels is recommended. Copyright © 2008 John Wiley & Sons, Ltd.
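The log-linear benchmark whose maintained assumptions are at issue is the familiar gravity specification; the variable names here are illustrative:

$$\ln X_{ij} = \beta_0 + \beta_1 \ln Y_i + \beta_2 \ln Y_j + \gamma' z_{ij} + \varepsilon_{ij},$$

where $X_{ij}$ is the trade flow from country $i$ to country $j$, $Y_i$ and $Y_j$ are the partners' incomes, and $z_{ij}$ collects (log) observable trade-cost proxies such as distance. The two assumptions being tested are that unobserved trade costs are (log-)linear in $z_{ij}$ and that $\gamma$ is common across country pairs.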

20.
Although networks have long governed economic relations, they assume even more importance in a knowledge-based economy. Yet, some argue that because of the lack of social networks and human capital, some groups are permanently 'switched off' the networks of the global economy. Evidence presented in this article suggests that instead there is latent potential for access to the network, due to the rise of networked community-based organizations and the increasing accessibility of technology. Based on surveys and in-depth interviews with almost 700 workers and training providers, I show how the switched off are entering jobs in information technology through network ties and the acquisition of soft skills, or communication and interaction skills. Although community-based training providers are best positioned to help disadvantaged jobseekers enter the network society, changes in the US workforce development system are reinforcing network exclusivity, rather than facilitating this upward mobility.
