Search results: 2,939 records.
921.
A stochastic version of the single-source capacitated facility location problem is considered. A set of capacitated facilities must be selected to serve demand points with stochastic demand at minimal total cost, where each facility has a service-level requirement modeled as a chance constraint. For Poisson demand, the problem is shown to be equivalent to a known, solvable deterministic problem; for normally distributed demand, it is equivalent to a deterministic mixed-integer nonlinear programming problem. A hybrid heuristic combining Lagrangean relaxation with a single-customer multi-exchange heuristic is embedded within a branch-and-bound framework to compute upper and lower bounds for this problem. On test instances created from benchmark problems (10–20 facilities, 50 demand nodes) and real-life data for the deterministic problem, the gap between the bounds is within 6.5%, with an average of 2.5%.
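As an illustration of the reduction mentioned above, the following sketch checks the standard deterministic equivalent of a service-level chance constraint under independent, normally distributed customer demands. The function, notation, and numbers are our assumptions for illustration, not the paper's formulation.

```python
# Sketch: deterministic equivalent of a service-level chance constraint
# under independent normal demands (illustrative; notation is ours).
from math import sqrt
from scipy.stats import norm

def capacity_feasible(assign, mu, sigma, capacity, alpha=0.95):
    """Check P(total assigned demand <= capacity) >= alpha for one facility.

    assign   -- binary assignment decisions x_i for each customer
    mu, sigma -- per-customer demand means and standard deviations
    capacity -- the facility's capacity
    alpha    -- required service level
    """
    z = norm.ppf(alpha)  # standard-normal quantile z_alpha
    mean_load = sum(x * m for x, m in zip(assign, mu))
    # x_i is 0/1, so x_i**2 == x_i and the assigned variances simply add up
    std_load = sqrt(sum(x * s ** 2 for x, s in zip(assign, sigma)))
    return mean_load + z * std_load <= capacity

# Two of three customers assigned to a facility with capacity 30
print(capacity_feasible([1, 1, 0], mu=[10, 12, 8], sigma=[2, 3, 1], capacity=30))
```

The chance constraint P(Σ d_i x_ij ≤ s_j) ≥ α thus becomes the deterministic inequality Σ μ_i x_ij + z_α √(Σ σ_i² x_ij) ≤ s_j, which is why the normal-demand case reduces to a mixed-integer nonlinear program.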
922.
In missing-data problems, there is often a natural test statistic for testing a statistical hypothesis had all the data been observed. A fuzzy p-value approach to hypothesis testing has recently been proposed, implemented by imputing the missing values in the "complete data" test statistic with values simulated from the conditional null distribution given the observed data. We argue that imputing data in this way inevitably leads to a loss in power. For the case of a scalar parameter, we show that the asymptotic efficiency of the score test based on the imputed "complete data" relative to the score test based on the observed data is given by the ratio of the observed-data information to the complete-data information. Three examples involving probit regression, a normal random-effects model, and unidentified paired data are used for illustration. For testing linkage disequilibrium based on pooled genotype data, simulation results show that the imputed Neyman–Pearson and Fisher exact tests are less powerful than a Wald-type test based on the observed-data maximum likelihood estimator. In conclusion, we caution against the routine use of the fuzzy p-value approach in latent-variable or missing-data problems and suggest some viable alternatives.
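The efficiency result quoted above can be written compactly. The symbols I_obs and I_com are our notation, but the ratio is exactly the one the abstract states:

```latex
% Asymptotic relative efficiency of the imputed-data score test
% (scalar parameter); notation ours, ratio as stated in the abstract.
\mathrm{ARE}(\theta) \;=\; \frac{I_{\mathrm{obs}}(\theta)}{I_{\mathrm{com}}(\theta)} \;\le\; 1
```

Because the observed-data Fisher information never exceeds the complete-data information (missing values can only lose information), the ratio is at most one, which formalizes the claimed loss in power.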
923.
Foreseeing the advent of new technologies and their socio-economic impact is a necessity for academia, governments, and private enterprises alike. In futures studies, the identification of future signals is a well-established technique for analyzing trends and emerging issues and for gaining foresight. In the Big Data era, scholars have proposed text-mining procedures focused on web data such as social media and academic papers. However, the detection of future signals is still a developing area of research, with much room to improve existing methodology and its theoretical foundations. The present study reviews the literature on identifying emerging issues through the weak-signal detection approach. The authors then propose a revised framework that incorporates quantitative and qualitative text mining to assess the strength of future signals, and apply it to a case study on the ethical issues of artificial intelligence (AI). Text data covering ethical issues in AI were collected from the EBSCOhost database and analyzed with text mining. The results reveal that emerging ethical issues can be classified as strong signals, weak signals, well-known but not-so-strong signals, and latent signals. The revised methodology can provide insights for government and business stakeholders by identifying future signals and their meanings in various fields.
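The abstract does not give the scoring rule behind the four categories, so the following is only a schematic illustration of a four-way classification based on a term's overall visibility and its growth rate; the thresholds and the mapping to labels are our assumptions, not the paper's method.

```python
# Illustrative sketch only: classify terms into the four signal types named
# in the abstract, using total frequency (visibility) vs. growth rate.
# Thresholds and label mapping are hypothetical, not the paper's procedure.
def classify_signals(term_counts_by_year, freq_cut=100, growth_cut=0.5):
    """term_counts_by_year: {term: [count_year1, ..., count_yearN]}"""
    labels = {}
    for term, counts in term_counts_by_year.items():
        total = sum(counts)
        growth = (counts[-1] - counts[0]) / max(counts[0], 1)
        if total >= freq_cut and growth >= growth_cut:
            labels[term] = "strong signal"            # frequent and rising
        elif total < freq_cut and growth >= growth_cut:
            labels[term] = "weak signal"              # rare but rising fast
        elif total >= freq_cut:
            labels[term] = "well-known, not strong"   # frequent, flat or declining
        else:
            labels[term] = "latent signal"            # rare and flat
    return labels

print(classify_signals({"algorithmic bias": [40, 60, 90], "robot rights": [2, 2, 3]}))
```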
924.
This paper addresses how to use simple access pricing to promote entry in the presence of spillovers and essential facilities. The inherent dilemma is that the greater the spillover, the greater the benefit of attracting good entrants, but the harder it is to exclude low-quality entrants, who can free ride on the quality of good entrants. We show that the inability to discriminate between entrants discourages access prices that would enable the low-quality entrant to enter, generates a time-consistency problem, and gives good-quality entrants an incentive to limit their competitiveness.
925.
We review strategies that movie distributors have used to cope with piracy, copying, and sharing of movies in the United States in four categories: "hard goods" commercial piracy, consumer theft of pay-TV signals, consumer copying and sharing of videos and pay TV, and (mostly in prospect) Internet file sharing. In the past, distributors have mainly sought to raise the costs of engaging in these activities by increasing legal jeopardy, favoring anti-copy technology, and reducing original sources of supply. They appear to have effectively reduced or contained most piracy, copying, and sharing of movies in the U.S., at least with analog media. Distributors are following similar strategies with digital media, including Internet file sharing. Digital media raise the stakes because of the lower costs of copying or sharing and the higher quality of outputs. Digital outputs are not always as high quality as source originals, however, and digital rights management (DRM) technologies potentially improve distributor control. The movie studios now face technological, demand, and political uncertainties in the U.S., notably in maintaining or achieving technically compatible DRM systems to control file sharing and PPV/VOD copying. Implications for foreign markets and directions for research are discussed. Some sections of this article draw substantially on David Waterman (2005a), Hollywood's Road to Riches (Cambridge, MA: Harvard University Press).
926.
Recent variable annuities offer participation in the equity market together with attractive protection against downside movements. Accurately quantifying this additional equity-market risk and robustly hedging the options embedded in the guarantees of variable annuities are new challenges for insurance companies. Because the benefits are sensitive to the tails of the account-value distribution, a simple Black–Scholes model is inadequate for preventing excessive liabilities; a model that realistically describes real-world price dynamics over a long horizon is essential for the risk management of variable annuities. In this article, both jump risk and volatility risk are considered in the risk management of lookback options embedded in guarantees with a ratchet feature. We evaluate the relative performance of delta hedging and dynamic discrete risk-minimization hedging strategies. Using the underlying as the hedging instrument, we show that, under a Black–Scholes model, local risk-minimization hedging can be significantly better than delta hedging. In addition, we compare risk-minimization hedging using the underlying with hedging using standard options. We demonstrate that, under a Merton jump-diffusion model, hedging with standard options is superior to hedging with the underlying in terms of risk reduction. Finally, we consider a market model for volatility risk in which the at-the-money implied volatility is a state variable. We compute risk-minimization hedges by modeling the at-the-money Black–Scholes implied volatility explicitly; hedging effectiveness is evaluated, however, under a joint model for the underlying price and implied volatility. Our computational results suggest that, when implied volatility risk is suitably modeled, risk-minimization hedging with standard options, compared to hedging with the underlying, can be more effective in reducing risk under both jump and volatility risks.
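For readers unfamiliar with the baseline strategy being compared, the sketch below simulates discrete delta hedging of a short European call under Black–Scholes; the residual P&L is the discretization error that risk-minimization methods aim to reduce. All parameters are made up, and this is our illustration of the textbook strategy, not the paper's hedging procedure.

```python
# Minimal discrete delta-hedging sketch under Black-Scholes (illustrative).
import numpy as np
from scipy.stats import norm

def bs_d1(S, K, r, sig, tau):
    return (np.log(S / K) + (r + 0.5 * sig**2) * tau) / (sig * np.sqrt(tau))

def bs_call(S, K, r, sig, tau):
    d1 = bs_d1(S, K, r, sig, tau)
    d2 = d1 - sig * np.sqrt(tau)
    return S * norm.cdf(d1) - K * np.exp(-r * tau) * norm.cdf(d2)

def hedge_error(S0=100.0, K=100.0, r=0.02, sig=0.2, T=1.0, steps=252, seed=0):
    """Write one call, delta-hedge it at each step, return terminal P&L."""
    rng = np.random.default_rng(seed)
    dt = T / steps
    S = S0
    pos = norm.cdf(bs_d1(S, K, r, sig, T))      # initial delta position
    cash = bs_call(S, K, r, sig, T) - pos * S   # premium received, shares bought
    for i in range(1, steps + 1):
        S *= np.exp((r - 0.5 * sig**2) * dt + sig * np.sqrt(dt) * rng.standard_normal())
        cash *= np.exp(r * dt)                  # cash account accrues interest
        if i < steps:
            new_pos = norm.cdf(bs_d1(S, K, r, sig, T - i * dt))
            cash -= (new_pos - pos) * S         # rebalance at the new price
            pos = new_pos
    return pos * S + cash - max(S - K, 0.0)     # ~0 up to discretization error

print(hedge_error())
```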
927.
Typically, research on organizational learning has been conceptual in nature. In a departure from this tradition, we develop and test a structural model of organizational learning in the context of purchasing an expensive and complex product in the information technology (IT) area. The key focus of our research is the participation of external IT consultants, and our model links seven explanatory constructs consistent with the process school of thought in organizational learning. More specifically, two organizational variables (formalization and strategic importance) and two individual-level variables (stakeholding and prior experience) are viewed as antecedents of consultant participation, while internal search effort, external search effort, and organizational learning are viewed as its consequences. As predicted, all four antecedent variables affected consultant participation. Moreover, we found that, while consultant participation had a positive impact on internal search effort and organizational learning, its impact on external search effort was negative.
928.
This paper studies a unique buyback method allowing firms to reacquire their own shares on a separate trading line where only the firm is allowed to buy shares. This share repurchase method is called the Second Trading Line and has been extensively used by Swiss companies since 1997. This type of repurchase is unique for two reasons. First, unlike open market programs, the repurchasing company does not trade anonymously. Second, all transactions made by the repurchasing firm are publicly available in real time to every market participant. This is a case of instantaneous disclosure, which contrasts sharply with other markets characterized by delayed or no disclosure. We document that the daily repurchase decision is statistically associated with short-term price changes and the release of firm-specific news. We also find that repurchases on the second trading line have a beneficial impact on the liquidity of repurchasing firms. Exchanges and regulators may consider the second trading line an attractive share reacquisition mechanism because of its transparency and positive liquidity effects.
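The kind of association test the abstract describes (daily repurchase decision vs. short-term price changes) could be run as a binary-response regression. The sketch below uses simulated data and a probit specification; the data-generating process, variables, and coefficients are entirely hypothetical and only illustrate the shape of such a test.

```python
# Hypothetical sketch: probit of a daily buy indicator on the previous day's
# return. Data are simulated; this is not the paper's dataset or specification.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
lag_return = rng.normal(0, 0.02, n)          # previous-day stock return
# Simulated behavior: the firm is more likely to buy after price declines
buy = (rng.normal(0, 1, n) < -0.5 - 20 * lag_return).astype(int)

X = sm.add_constant(lag_return)
probit = sm.Probit(buy, X).fit(disp=False)
print(probit.summary())  # a negative slope = repurchases follow price declines
```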
929.
930.
Constructing a database of 37 industries, we examine whether measured productivity in Japan is pro-cyclical and investigate the sources of this pro-cyclicality using the production-function approach of Hall (1990) and Basu and Fernald (1995). The aggregate Solow residual displays pro-cyclicality. A large number of industries show constant returns to scale, and no significant evidence for the presence of thick-market externalities is found. Our results also hold when we account for labour hoarding, part-time employment, and the adjustment cost of investment. The results indicate that policies to revitalize the Japanese economy should concentrate on promoting productivity growth.
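For context, the Hall (1990) / Basu and Fernald (1995) approach cited above is usually written as a regression of output growth on share-weighted input growth; the notation below is the generic textbook form, not copied from the paper:

```latex
% Hall / Basu-Fernald style estimating equation (generic notation):
\Delta y_t \;=\; \gamma \,\Delta x_t \;+\; \varepsilon_t,
\qquad
\Delta x_t \;=\; s_K \Delta k_t + s_L \Delta l_t + s_M \Delta m_t
```

Here γ measures returns to scale (γ = 1 is constant returns), the shares s_K, s_L, s_M weight capital, labour, and materials growth, and the residual Δy_t − Δx_t is the Solow residual whose co-movement with output growth defines the pro-cyclicality the abstract examines.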