Article Search
  Subscription full text: 4245 articles
  Free: 117 articles
  Free (domestic): 14 articles
By subject:
  Finance & Banking: 978
  Industrial Economics: 149
  Planning & Management: 1100
  Economics: 939
  General: 109
  Transport Economics: 95
  Tourism Economics: 85
  Trade Economics: 451
  Agricultural Economics: 218
  Economic Overview: 252
By year:
  2023: 85
  2022: 37
  2021: 117
  2020: 223
  2019: 172
  2018: 150
  2017: 201
  2016: 191
  2015: 128
  2014: 248
  2013: 536
  2012: 158
  2011: 247
  2010: 144
  2009: 206
  2008: 215
  2007: 189
  2006: 200
  2005: 144
  2004: 115
  2003: 94
  2002: 86
  2001: 83
  2000: 57
  1999: 76
  1998: 47
  1997: 54
  1996: 31
  1995: 28
  1994: 19
  1993: 15
  1992: 12
  1991: 12
  1990: 5
  1988: 5
  1987: 4
  1986: 3
  1985: 14
  1984: 13
  1983: 6
  1982: 3
  1981: 1
  1980: 1
  1979: 1
A total of 4376 results found (search time: 78 ms)
31.
Competition in the US long-distance market continues to intensify; the 1996 Telecommunications Act has increased competition in long-distance telephony, especially as the Regional Bell Operating Companies have begun to enter long-haul, long-distance markets. To better understand the implications of these expanded service offerings, models of how customers choose between carriers (and of how this choice affects subsequent usage) are needed. We develop the first publicly available models that simultaneously estimate choice and usage for intraLATA long-distance service in the US. Using a generalized Tobit model, we estimate the price responsiveness of both usage and carrier choice. The results are generally consistent with expectations, both theoretical and from practical experience in the industry.
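A minimal sketch of what such a joint choice-and-usage (Type II / generalized Tobit) likelihood can look like, assuming a probit carrier-choice equation and a linear usage equation with correlated errors; the variable names and synthetic data are illustrative, not taken from the paper:

```python
# Hypothetical sketch of a Type II (generalized) Tobit: a probit carrier-choice
# equation plus a usage equation observed only when the carrier is chosen.
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

def negloglik(params, W, X, chose, usage):
    kw, kx = W.shape[1], X.shape[1]
    gamma = params[:kw]                      # choice-equation coefficients
    beta = params[kw:kw + kx]                # usage-equation coefficients
    sigma = np.exp(params[-2])               # usage error scale (> 0)
    rho = np.tanh(params[-1])                # error correlation in (-1, 1)
    zw = W @ gamma
    # Households that did not choose the carrier contribute P(no choice).
    ll = norm.logcdf(-zw[~chose]).sum()
    # Choosers contribute the usage density times P(choice | usage).
    resid = (usage[chose] - X[chose] @ beta) / sigma
    ll += (norm.logpdf(resid) - np.log(sigma)).sum()
    ll += norm.logcdf((zw[chose] + rho * resid) / np.sqrt(1 - rho**2)).sum()
    return -ll

# Synthetic example: price enters both the choice and the usage equations.
rng = np.random.default_rng(0)
n = 2000
price = rng.uniform(0.05, 0.25, n)
W = np.column_stack([np.ones(n), price])
X = W.copy()
u, e = rng.multivariate_normal([0, 0], [[1, 0.4], [0.4, 1]], n).T
chose = (1.0 - 6.0 * price + u) > 0
usage = 3.0 - 8.0 * price + e
theta0 = np.zeros(W.shape[1] + X.shape[1] + 2)
fit = minimize(negloglik, theta0, args=(W, X, chose, usage), method="BFGS")
print(fit.x)  # gamma, beta, log(sigma), atanh(rho)
```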
32.
This paper presents and analyses the differences among the eco-models implemented worldwide, both in their design (for example, whether and how carbon-tax revenues are "recycled") and in their efficiency parameters (inconsistent parameter values that account for divergent results). The underlying assumption is that a real tradeoff exists between the production of environmental goods and of other goods. The article argues empirically that something must be given up in order to gain something else, and that once equations are specified to trace out the path of the economy over time, the natural economic formulation of those equations will embody the notion of economic and biological tradeoffs.
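One minimal way to write down tradeoff equations of the kind the abstract alludes to (our illustration, not the paper's model): output must be split between consumption and environmental protection, and environmental quality evolves with that split.

```latex
% Y_t: output, C_t: consumption, A_t: abatement spending,
% E_t: environmental quality. Producing more C_t means less A_t,
% so the dynamics embody the economic/bio tradeoff over time.
\[
Y_t = C_t + A_t, \qquad
E_{t+1} = E_t + g(A_t) - d(Y_t)
\]
```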
33.
34.
We consider the normalized least squares estimator of the parameter in a nearly integrated first-order autoregressive model with dependent errors. In a first step we consider its asymptotic distribution as well as an asymptotic expansion up to order O_p(T^{−1}). We derive a limiting moment generating function that enables us to calculate various distributional quantities by numerical integration. A simulation study is performed to assess the adequacy of the asymptotic distribution when the errors are correlated. We focus our attention on two leading cases: MA(1) errors and AR(1) errors. The asymptotic approximations are shown to be inadequate as the MA root gets close to −1 and as the AR root approaches either −1 or 1. Our theoretical analysis helps to explain and understand the simulation results of Schwert (1989) and DeJong, Nankervis, Savin, and Whiteman (1992) concerning the size and power of Phillips and Perron's (1988) unit root test. A companion paper, Nabeya and Perron (1994), presents alternative asymptotic frameworks in the cases where the usual asymptotic distribution fails to provide an adequate approximation to the finite-sample distribution.
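A small simulation sketch of the setting described above, assuming a local-to-unity parameterization rho = 1 + c/T and MA(1) errors; the settings are illustrative, not taken from the paper:

```python
# Finite-sample distribution of the normalized least-squares estimator
# T*(rho_hat - rho) in a nearly integrated AR(1), y_t = rho*y_{t-1} + u_t
# with rho = 1 + c/T and MA(1) errors u_t = e_t + theta*e_{t-1}.
import numpy as np

def normalized_lse(T, c=-5.0, theta=-0.8, reps=5000, seed=0):
    rng = np.random.default_rng(seed)
    rho = 1.0 + c / T
    stats = np.empty(reps)
    for r in range(reps):
        e = rng.standard_normal(T + 1)
        u = e[1:] + theta * e[:-1]          # MA(1) errors
        y = np.empty(T)
        y[0] = u[0]
        for t in range(1, T):
            y[t] = rho * y[t - 1] + u[t]
        ylag, ycur = y[:-1], y[1:]
        rho_hat = (ylag @ ycur) / (ylag @ ylag)
        stats[r] = T * (rho_hat - rho)
    return stats

# As theta approaches -1, the finite-sample distribution drifts away from
# the asymptotic approximation, mirroring the size distortions reported
# for Phillips-Perron-type tests.
s = normalized_lse(T=100)
print(np.percentile(s, [5, 50, 95]))
```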
35.
An extensive collection of continuous-time models of the short-term interest rate is evaluated over data sets that have appeared previously in the literature. The analysis, which uses the simulated maximum likelihood procedure proposed by Durham and Gallant (2002), provides new insights regarding several previously unresolved questions. For single factor models, I find that the volatility, not the drift, is the critical component in model specification. Allowing for additional flexibility beyond a constant term in the drift provides negligible benefit. While constant drift would appear to imply that the short rate is nonstationary, in fact, stationarity is volatility-induced. The simple constant elasticity of volatility model fits weekly observations of the three-month Treasury bill rate remarkably well but is easily rejected when compared with more flexible volatility specifications over daily data. The methodology of Durham and Gallant can also be used to estimate stochastic volatility models. While adding the latent volatility component provides a large improvement in the likelihood for the physical process, it does little to improve bond-pricing performance.
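As a rough illustration of the model class (not the paper's simulated maximum likelihood estimator), an Euler discretisation of a constant-elasticity-of-volatility short-rate process can be simulated as follows; all parameter values are invented:

```python
# Euler discretisation of dr = kappa*(theta - r) dt + sigma * r**gamma dW.
# With kappa near zero the drift is essentially constant, and with gamma > 1
# the level-dependent volatility is what pulls high rates back down --
# the "volatility-induced stationarity" the abstract refers to.
import numpy as np

def simulate_cev(r0=0.03, kappa=0.2, theta=0.05, sigma=0.4, gamma=1.5,
                 dt=1 / 52, n_steps=520, seed=1):
    rng = np.random.default_rng(seed)
    r = np.empty(n_steps + 1)
    r[0] = r0
    for t in range(n_steps):
        drift = kappa * (theta - r[t])           # mean reversion (may be ~0)
        vol = sigma * max(r[t], 1e-8) ** gamma   # level-dependent volatility
        r[t + 1] = r[t] + drift * dt + vol * np.sqrt(dt) * rng.standard_normal()
    return r

path = simulate_cev()
print(path[:5])
```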
36.
Multinational companies face increasing risks arising from external risk factors, e.g. exchange rates, interest rates and commodity prices, which they have learned to hedge using derivatives. However, despite increasing disclosure requirements, a firm's net risk profile may not be transparent to shareholders. We develop the 'Component Value-at-Risk (VaR)' framework for companies to identify the multi-dimensional downside risk profile as perceived by shareholders. This framework decomposes downside risk into components attributable to each of the underlying risk factors. The firm can compare this perceived VaR, including its composition and dynamics, with an internal VaR based on the net exposures known to the company. Any differences may lead to surprises at times of earnings announcements and thus constitute a litigation threat; the firm may reduce this information asymmetry through targeted communication efforts.
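A minimal sketch of one standard way to decompose VaR by risk factor, the delta-normal Euler allocation, whose components sum exactly to total VaR; the exposures and covariance matrix below are invented, and the paper's exact framework may differ:

```python
# Delta-normal component VaR: component_i = z * w_i * (Cov @ w)_i / sd(portfolio).
import numpy as np

z = 1.645                                   # 95% one-sided normal quantile
exposures = np.array([120.0, -80.0, 45.0])  # FX, rates, commodity (hypothetical)
cov = np.array([[0.0004, 0.0001, 0.0000],
                [0.0001, 0.0002, 0.0000],
                [0.0000, 0.0000, 0.0009]])  # covariance of factor returns

port_sd = np.sqrt(exposures @ cov @ exposures)
total_var = z * port_sd
# Euler decomposition: the components below sum exactly to total VaR,
# attributing downside risk to each underlying factor.
components = z * exposures * (cov @ exposures) / port_sd
print(total_var, components, components.sum())
```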
37.
The authors assess the importance of statistics in the monetary policy of the Czech National Bank (CNB) over the course of the economic transformation, with particular attention to changing statistical needs and to the possibilities and limits of exploiting statistical data in monetary analyses. Statistics matter both at the level of collecting and processing statistical information and at the level of applying statistical methods to analyse the data. Since the start of the 1990s, the requirements placed on statistics have been shaped largely by monetary policy: in 1990–1997 by monetary targeting, and since 1998 by inflation targeting. Statistical priorities accordingly shifted from monetary data to economic and financial-market data, and much progress has been made in the use of statistical methods for analysing them. The statistics available at present cover the CNB's standard monetary-policy requirements and are on par with those of developed countries; their further development will reflect the standard changes taking place in the more advanced countries.
38.
Simpler Probabilistic Population Forecasts: Making Scenarios Work
The traditional high-low-medium scenario approach to quantifying uncertainty in population forecasts has been criticized as lacking probabilistic meaning and consistency. This paper shows, under certain assumptions, how appropriately calibrated scenarios can be used to approximate the uncertainty intervals on future population size and age structure obtained with fully stochastic forecasts. Because many forecasting organizations already produce scenarios, and because dealing with them is familiar territory, the methods presented here offer an attractive intermediate position between probabilistically inconsistent scenario analysis and fully stochastic forecasts.
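A toy sketch of the calibration idea (our construction, not the paper's method), assuming i.i.d. normal annual growth rates: deterministic scenarios reproduce the t-year quantiles of a fully stochastic forecast only if their cumulated offsets grow like sqrt(t); all numbers are invented:

```python
# Calibrate high/low scenarios so they track the 80% interval of a fully
# stochastic population forecast with i.i.d. random growth rates.
import numpy as np

rng = np.random.default_rng(2)
p0, horizon, reps = 10.0, 50, 10000       # population (millions), years, draws
mu, sd = 0.005, 0.01                      # mean and s.d. of annual growth

# Fully stochastic forecast: cumulate random growth rates.
growth = rng.normal(mu, sd, size=(reps, horizon))
paths = p0 * np.exp(np.cumsum(growth, axis=1))
lo, hi = np.percentile(paths, [10, 90], axis=0)

# Calibrated scenarios: the cumulated offset must scale with sqrt(t), so the
# deterministic scenario tracks the t-year 80% quantiles (a naive scenario
# that applies mu +/- z*sd every year would overstate long-run uncertainty).
z = 1.2816
t = np.arange(1, horizon + 1)
scen_lo = p0 * np.exp(mu * t - z * sd * np.sqrt(t))
scen_hi = p0 * np.exp(mu * t + z * sd * np.sqrt(t))
print(np.max(np.abs(scen_lo - lo)), np.max(np.abs(scen_hi - hi)))
```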
39.
In this study we propose a mathematical definition of the consumption efficiency of multi-attribute products in the price–quality space. A new model, the discrete Range Adjusted Measure (RAM) model, is suggested as an empirical tool to measure the level of consumption efficiency. We further discuss the effect of consumption efficiency on the innovation incentive. Empirical work is carried out for the mobile phone market. We expect that the consumption efficiency concept will contribute to the extension of the traditional framework of production efficiency analysis on the one hand and to the understanding of the nature of innovation in a technology-intensive market on the other.
JEL Classification: C67, D11, D12, D21
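A sketch of the additive Range Adjusted Measure as a linear program in the price–quality space, with price as the single input and quality attributes as outputs; the data and the variable-returns-to-scale setup are our illustrative assumptions, not the paper's specification:

```python
# RAM DEA: maximize the range-normalized sum of slacks; efficiency = 1 - optimum.
import numpy as np
from scipy.optimize import linprog

# rows = products; price is the input, battery life and camera score the outputs
price = np.array([[300.], [450.], [250.], [500.]])
quality = np.array([[20., 70.], [30., 90.], [15., 60.], [28., 95.]])
n, m = price.shape[0], price.shape[1]
s_out = quality.shape[1]

R_in = price.max(0) - price.min(0)       # input ranges (max - min)
R_out = quality.max(0) - quality.min(0)  # output ranges

def ram_efficiency(o):
    # variables: lambda (n), input slacks (m), output slacks (s_out)
    c = np.concatenate([np.zeros(n), -1 / R_in, -1 / R_out]) / (m + s_out)
    A_eq = np.block([
        [price.T, np.eye(m), np.zeros((m, s_out))],         # X@lam + s_in = x_o
        [quality.T, np.zeros((s_out, m)), -np.eye(s_out)],  # Y@lam - s_out = y_o
        [np.ones((1, n)), np.zeros((1, m + s_out))],        # sum(lam) = 1 (VRS)
    ])
    b_eq = np.concatenate([price[o], quality[o], [1.0]])
    res = linprog(c, A_eq=A_eq, b_eq=b_eq)  # all variables >= 0 by default
    return 1.0 + res.fun                    # RAM efficiency in [0, 1]

print([round(ram_efficiency(o), 3) for o in range(n)])
```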
40.
Bentler and Raykov (2000, Journal of Applied Psychology 85: 125–131) and Jöreskog (1999a, http://www.ssicentral.com/lisrel/column3.htm; 1999b, http://www.ssicentral.com/lisrel/column5.htm) proposed procedures for calculating R² for dependent variables involved in loops or possessing correlated errors. This article demonstrates that Bentler and Raykov's procedure cannot routinely be interpreted as a "proportion" of explained variance, while Jöreskog's reduced-form calculation is unnecessarily restrictive. The new blocked-error R² (beR²) uses a minimal hypothetical causal intervention to resolve the variance-partitioning ambiguities created by loops and correlated errors. Hayduk (1996) discussed how stabilising feedback models – models capable of counteracting external perturbations – can result in an acceptable error variance which exceeds the variance of the dependent variable to which that error is attached. For variables included within loops, whether stabilising or not, beR² provides the same value as Hayduk's (1996) loop-adjusted R². For variables not involved in loops and not displaying correlated residuals, beR² reports the same value as the traditional regression R². Thus, beR² provides a conceptualisation of the proportion of explained variance that spans both recursive and nonrecursive structural equation models. A procedure for calculating beR² in any SEM program is provided.
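A sketch of the blocked-error idea for a linear SEM y = By + zeta, assuming the model-implied covariance (I−B)⁻¹ Ψ (I−B)⁻ᵀ; the three-variable feedback example is invented for illustration:

```python
# Blocked-error R2 (beR2): block (zero out) the target variable's error,
# recompute its model-implied variance, and take the ratio to total variance.
import numpy as np

def implied_cov(B, Psi):
    inv = np.linalg.inv(np.eye(B.shape[0]) - B)
    return inv @ Psi @ inv.T

def be_r2(B, Psi, i):
    total = implied_cov(B, Psi)[i, i]
    Psi_blocked = Psi.copy()
    Psi_blocked[i, :] = 0.0       # block the target's error variance and its
    Psi_blocked[:, i] = 0.0       # covariances with the other errors
    blocked = implied_cov(B, Psi_blocked)[i, i]
    return blocked / total

# y1 -> y2 <-> y3 feedback loop; errors are uncorrelated in this example.
B = np.array([[0.0, 0.0, 0.0],
              [0.6, 0.0, 0.3],
              [0.0, 0.4, 0.0]])
Psi = np.diag([1.0, 0.5, 0.7])
print([round(be_r2(B, Psi, i), 3) for i in range(3)])
# For a recursive (loop-free) variable this reduces to the regression R2,
# consistent with the abstract's claim about the no-loop case.
```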