951.
Stephen H. Schneider, B. L. Turner, Holly Morehouse Garriga. Journal of Risk Research, 2013, 16(2): 165-185
Decision-makers at all scales (individuals, firms, and local, national, and international governmental organizations) are concerned about reducing their vulnerability to (or the likelihood of) unexpected events, 'surprises.' After briefly and selectively reviewing the literature on uncertainty and surprise, we adopt a definition of 'surprise' that does not include the strict requirement that it apply to a wholly unexpected outcome, but rather recognizes that many events are anticipated by some, even if not most, observers. Thus, we define 'imaginable surprise' as events or processes that depart from the expectations of some definable community. What gets labelled as 'surprise' therefore depends on the extent to which what happens departs from community expectations and on the salience of the problem. We offer a typology of surprise that distinguishes imaginable surprises from risk and uncertainty, and identifies several kinds of impediments to overcoming ignorance. These range from the need for more 'normal science' to phenomenological impediments (e.g., inherent unpredictability in some chaotic systems) to epistemological ignorance (e.g., ideological blocks to reducing ignorance). Based on the input of some two dozen scholars at an Aspen Global Change Institute Summer Workshop in 1994, we construct two tables in which participants offer many possible 'imaginable surprises' in the global change context, as well as their potential salience for creating unexpectedly high or low carbon dioxide emissions. Improving the anticipation of surprises is an interdisciplinary enterprise that should offer a sceptical welcoming of outlier ideas and methods.
952.
It is widely believed that fluctuations in transaction volume, as reflected in the number of transactions and, to a lesser extent, their size, are the main cause of clustered volatility. Under this view, bursts of rapid or slow price diffusion reflect bursts of frequent or less frequent trading, which cause both clustered volatility and heavy tails in price returns. We investigate this hypothesis using tick-by-tick data from the New York and London Stock Exchanges and show that only a small fraction of volatility fluctuations is explained in this manner. Clustered volatility remains very strong even if price changes are recorded over intervals in which the total transaction volume or the number of transactions is held constant. In addition, the distribution of price returns conditioned on volume or transaction frequency being held constant is similar to that in real time, making it clear that neither of these is the principal cause of heavy tails in price returns. We analyse recent results of Ane and Geman (2000: J. Finance, 55, 2259–2284) and Gabaix et al. (2003: Nature, 423, 267–270), and discuss the reasons why their conclusions differ from ours. Based on a cross-sectional analysis, we show that the long memory of volatility is dominated by factors other than transaction frequency or total trading volume.
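As a hedged illustration of the kind of test described above (not the authors' code), the sketch below resamples a tick series two ways, in clock time and in "transaction time" where each bin holds a fixed number of trades, and compares the strength of volatility clustering via the autocorrelation of absolute returns. The column names, bin sizes, and data layout are assumptions.

```python
# Illustrative sketch only: compare volatility clustering in clock time vs.
# transaction time (bins holding the number of trades constant). Column
# names ("timestamp", "price") and the bin sizes are assumptions.
import numpy as np
import pandas as pd

def abs_return_acf(prices: np.ndarray, lag: int = 1) -> float:
    """Autocorrelation of absolute log returns at the given lag."""
    r = np.diff(np.log(prices))
    a = np.abs(r) - np.abs(r).mean()
    return float(np.dot(a[:-lag], a[lag:]) / np.dot(a, a))

def clustering_comparison(ticks: pd.DataFrame, trades_per_bin: int = 100,
                          clock_freq: str = "5min") -> dict:
    # Clock-time sampling: last price in each fixed-length interval.
    clock = ticks.set_index("timestamp")["price"].resample(clock_freq).last().dropna()
    # Transaction-time sampling: last price after every `trades_per_bin` trades.
    idx = np.arange(trades_per_bin - 1, len(ticks), trades_per_bin)
    trans = ticks["price"].to_numpy()[idx]
    return {"clock_time_acf": abs_return_acf(clock.to_numpy()),
            "transaction_time_acf": abs_return_acf(trans)}
```

If transaction frequency were the main driver, the transaction-time autocorrelation would collapse towards zero; the abstract reports that clustering remains strong under this conditioning.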
953.
Christian L. Dunis, Jason Laws, Andreas Karathanasopoulos. European Journal of Finance, 2013, 19(3): 180-205
In the current paper, we present an integrated genetic programming (GP) environment called java GP modelling. The java GP modelling environment is an implementation of the steady-state GP algorithm, which evolves tree-based structures that represent models of inputs and outputs. The motivation of this paper is to compare the GP algorithm with neural network (NN) architectures when applied to the task of forecasting and trading the ASE 20 Greek Index (using autoregressive terms as inputs). This is done by benchmarking the forecasting performance of the GP algorithm against six different autoregressive moving average (ARMA) and NN combination designs: a hybrid, mixed higher-order neural network (HONN); a hybrid, mixed recurrent neural network (RNN); and a hybrid, mixed classic multilayer perceptron, each combined with traditional techniques, either statistical (such as an ARMA model) or technical (such as a moving average convergence/divergence model), together with a naïve trading strategy. More specifically, the trading performance of all models is investigated in a forecast and trading simulation on ASE 20 closing prices over the period 2001–2008, using the last one and a half years for out-of-sample testing. We use the ASE 20 daily series because many financial institutions are ready to trade at this level, and it is therefore possible to leave orders with a bank for business to be transacted on that basis. As it turns out, the GP model does remarkably well and outperforms all other models in a simple trading simulation exercise. This is also the case when more sophisticated trading strategies using confirmation filters and leverage are applied: the GP model still produces better results and outperforms all other NN and traditional statistical models in terms of annualized return.
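To make the benchmark concrete, the hedged sketch below (not the paper's code) shows how a directional daily forecast can be scored as annualized return in this kind of trading simulation, alongside a naïve strategy that simply repeats the previous day's direction. Function names and the 252-day trading year are assumptions.

```python
# Illustrative sketch only: score a directional daily forecast in a simple
# trading simulation. Names and the 252-trading-day year are assumptions.
import numpy as np

def annualized_return(daily_returns: np.ndarray, signals: np.ndarray,
                      trading_days: int = 252) -> float:
    """Go long (+1) or short (-1) each day according to the model's signal."""
    return float((signals * daily_returns).mean() * trading_days)

def naive_signals(daily_returns: np.ndarray) -> np.ndarray:
    """Naive benchmark: predict that today's direction repeats tomorrow."""
    return np.sign(np.concatenate(([0.0], daily_returns[:-1])))
```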
954.
B. R. Orton, L. Sjöberg, J. Jung, D. Ürge-Vorsatz, M. Tamássyné-Bíró. Journal of Risk Research, 2013, 16(1): 17-29
Perceptions of risk in two groups of industrial radiographers, one from Hungary (n = 45) and one from the United Kingdom (n = 29), were compared using the psychometric method. The comparison was made because both groups were at risk of high doses of ionizing radiation. We found that the groups had similar demographic profiles, but the poorer socio-economic conditions of the Hungarians were associated with higher levels of emotional distress. Correlations between the Hungarian and UK groups for personal and general risks were significant for topics that included lifestyle and radiation risks. Perceived risks from radiation were small, except for a large perceived personal risk from East European nuclear power plants. Within each country, knowledge of radiation risk was correlated positively with personal risk for the UK radiographers and negatively for the Hungarians. However, average overall risk perceptions across the same topic list for all radiographers did not differ significantly from those of a group (n = 1461) of UK citizens, even though the radiographers' risks from radiation were considerably greater. As a new life-saving intervention, it was proposed that radiation risk reduction could be achieved by genetic testing.
955.
Michael L. Dekay, Mitchell J. Small, Paul S. Fischbeck, R. Scott Farrow, Alison Cullen, Joseph B. Kadane. Journal of Risk Research, 2013, 16(4): 391-417
A decision-analytic model for avoiding a risky activity is presented. The model considers the benefit and cost of avoiding the activity, the probability that the activity is unsafe, and scientific tests or studies that could be conducted to revise the probability that the activity is unsafe. For a single decision maker, thresholds are identified for his or her current subjective probability that the activity is unsafe. These thresholds indicate whether the preferred course of action is avoiding the activity without further study, engaging in the activity without further study, or conducting a test or research programme to obtain additional information and following the result. When these thresholds are low, precautionary action is more likely to be warranted. When there are multiple stakeholders, differences in their perceptions of the benefit and cost of avoidance and differences in their perceptions of the accuracy of the additional information provided by the test or research programme combine to create differences in their decision thresholds. Thus, the model allows for the rational expression of differences among parties in a way that highlights disagreements and possible paths to conflict resolution. The model is illustrated with an application to phytosanitary standards in international trade and examined in terms of recent empirical research on lay perceptions of risks, benefits, and trust. Further research is suggested to improve the elicitation of model components, as a way of fostering the legitimate application of risk-based decision analysis in precautionary policy making.
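The snippet below is a minimal numerical sketch of the three-way threshold logic described above, written under assumed expected-cost bookkeeping; the symbols (loss L if the activity is unsafe, forgone benefit B of avoiding it, test cost c, test sensitivity and specificity) are illustrative assumptions and not the authors' formulation.

```python
# Minimal sketch, under assumed expected-cost bookkeeping, of the three-way
# choice described above: avoid, engage, or test first. All symbols
# (loss L if unsafe, forgone benefit B of avoidance, test cost c,
# sensitivity/specificity) are illustrative assumptions, not the paper's.
def expected_costs(p, L=100.0, B=20.0, c=1.0, sens=0.9, spec=0.9):
    cost_avoid = B                      # forgone benefit of the activity
    cost_engage = p * L                 # expected loss if it turns out unsafe
    # Test, then follow the result: avoid on a positive, engage on a negative.
    p_pos = sens * p + (1 - spec) * (1 - p)          # P(test says "unsafe")
    p_unsafe_given_neg = ((1 - sens) * p) / (1 - p_pos) if p_pos < 1 else 0.0
    cost_test = c + p_pos * B + (1 - p_pos) * p_unsafe_given_neg * L
    return {"avoid": cost_avoid, "engage": cost_engage, "test": cost_test}

def preferred_action(p, **kwargs):
    costs = expected_costs(p, **kwargs)
    return min(costs, key=costs.get)

# Scanning p from 0 to 1 reveals the two thresholds: engage for small p,
# test for intermediate p, and avoid without further study for large p.
actions_by_p = [(round(p, 2), preferred_action(p))
                for p in (i / 100 for i in range(0, 101, 5))]
```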
956.
Nicolás C. Bronfman, Esperanza López Vázquez, Virna Vaneza Gutiérrez, Luis Abdón Cifuentes. Journal of Risk Research, 2013, 16(6): 755-773
Studies over the past decade have found empirical links between trust in risk management institutions and the risk perceptions and acceptability of various individual hazards. Most of this work has addressed food technologies, and no study to date has explored wider possible relationships among all four core variables (risk, benefit, trust and acceptability) across a heterogeneous group of hazards. Our prime objective was to ascertain the effects among social trust in regulatory entities and the public's perceived risk, perceived benefit and degree of acceptability towards both technological and environmental hazards. We also assess whether trust in regulatory authorities is the cause (causal model) or a consequence (associationist model) of a hazard's acceptability for a wide and heterogeneous range of hazards on all four core variables. Using a web-based survey, 539 undergraduates in Chile rated the five variables across 30 hazards. Implications for technology and environmental risk management organizations are discussed. Independent of the magnitude of the perceived risk or benefit surrounding a given hazard, or of how knowledgeable the public claims to be about it, the trust placed in regulatory institutions will either generate or be the consequence of public attitudes towards the hazard.
957.
In 2003, the European Commission (EC) started using Impact Assessment (IA) as the main empirical basis for its major policy proposals. The aim was to systematically assess, ex ante, the economic, social and environmental impacts of European Union (EU) policy proposals. In parallel, research proliferated in search of theoretical grounds for IAs and in an attempt to evaluate empirically the performance of the first sets of IAs produced by the EC. This paper combines conceptual and evaluative studies carried out in the first five years of EU IAs. It concludes that the great discrepancy between rationale and practice calls for a different theoretical focus and a greater emphasis on empirically evaluating crucial risk-economics aspects of IAs, such as the value of a statistical life, the price of carbon, and the integration of macroeconomic modelling and scenario analysis.
958.
Brian A. Rutherford. The British Accounting Review, 2013, 45(4): 297-310
A genre is a category of texts marked out by the conventions employed in their production. A genre-theoretic approach draws out the complex, subtle and elusive nature of financial reporting as communication. It provides scope for examining the features of the reporting process that contribute to its complexities and subtleties in a systematic, comprehensive and integrated way, embracing both technical and social dimensions. This paper discusses aspects of genre theory, as employed in discourse analysis, and their application to financial reporting. Relevant features of the approach include financial statement composition as a challenging process; knowing users; an engaged discourse community; situated communication; intertextuality; and structural dynamism. A genre-based approach has a number of implications for financial reporting research, at both methodological and substantive levels, which are explored in the paper, and may ultimately offer the potential for integrating market-based and interdisciplinary work together with the best of the classical tradition.
959.
In this paper, we propose a co-integration model with a logistic mixture autoregressive equilibrium error (co-integrated LMAR), in which the equilibrium relationship among the cumulative returns of different financial assets is modelled by a logistic mixture autoregressive time series model. The traditional autoregression (AR) based unit root test (the ADF test), used in testing co-integration, cannot give a sound explanation when a time series passes the ADF test even though the largest root of its AR polynomial is extremely close to, but less than, one, a situation that is most likely the result of a mixture of random-walk and mean-reverting processes in the time series data. With this background, we put an LMAR model into the co-integration framework to identify baskets that have a large spread but are still well co-integrated. A sufficient condition for the stationarity of the LMAR model is given and proved using a Markovian approach. A two-step estimation procedure, combining least-squares estimation and the Expectation-Maximization (EM) algorithm, is given. The Bayesian information criterion (BIC) is used in model selection. The co-integrated LMAR model is applied to basket trading, which is a widely used tool for arbitrage. We use simulation to assess the model in basket trading strategies with the statistical-arbitrage feature in equity markets. Data from several sectors of the Hong Kong Hang Seng Index are used in a simulation study on basket trading. Empirical results show that a portfolio using the co-integrated LMAR model has a higher return than portfolios selected by traditional methods. Although the volatility of the return increases, the Sharpe ratio also increases in most cases. This risk–return profile can be explained by the shorter convergence period in the co-integrated LMAR model and the larger volatility in the 'mean-reverting' regime.
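The toy simulation below (a sketch, not the authors' estimation code) illustrates the kind of two-regime equilibrium error the abstract refers to: with a probability given by a logistic function of the lagged spread, the error behaves like a near random walk, and otherwise like a mean-reverting AR(1). All parameter values and names are illustrative assumptions.

```python
# Toy simulation of a two-regime logistic mixture autoregressive (LMAR-style)
# equilibrium error: a logistic weight switches between a near-random-walk
# regime and a mean-reverting AR(1) regime. All parameter values are
# illustrative assumptions, not estimates from the paper.
import numpy as np

def simulate_lmar_spread(n=2000, a0=0.5, a1=-2.0, phi_rw=0.999,
                         phi_mr=0.6, sigma=1.0, seed=0):
    rng = np.random.default_rng(seed)
    e = np.zeros(n)
    for t in range(1, n):
        # Mixing weight depends on the lagged spread through a logistic link:
        # large deviations favour the mean-reverting regime.
        w_rw = 1.0 / (1.0 + np.exp(-(a0 + a1 * abs(e[t - 1]))))
        phi = phi_rw if rng.random() < w_rw else phi_mr
        e[t] = phi * e[t - 1] + sigma * rng.standard_normal()
    return e
```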
960.
This paper considers the problem of pricing American options when the dynamics of the underlying are driven both by stochastic volatility following a square-root process, as used by Heston [Rev. Financial Stud., 1993, 6, 327–343], and by a Poisson jump process, as introduced by Merton [J. Financial Econ., 1976, 3, 125–144]. Probability arguments are invoked to find a representation of the solution in terms of expectations over the joint distribution of the underlying process. A combination of a Fourier transform in the log stock price and a Laplace transform in the volatility is then applied to find the transition probability density function of the underlying process. It turns out that the price is given by an integral that depends upon the early exercise surface, for which a corresponding integral equation is obtained. The solution generalizes in an intuitive way the structure of the solution to the corresponding European option pricing problem obtained by Scott [Math. Finance, 1997, 7(4), 413–426], but here in the case of a call option and constant interest rates.
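For orientation, a standard risk-neutral statement of combined Heston square-root volatility and Merton jump dynamics, of the kind referred to above, can be written as follows; the notation and the jump-compensator term follow common textbook conventions and are not taken from the paper itself.

```latex
% Standard Heston-with-Merton-jumps risk-neutral dynamics
% (textbook notation, not necessarily the paper's):
\begin{aligned}
\frac{dS_t}{S_{t^-}} &= (r - \lambda \bar{k})\,dt + \sqrt{v_t}\,dW_t^{S} + (Y_t - 1)\,dN_t,\\
dv_t &= \kappa(\theta - v_t)\,dt + \sigma_v \sqrt{v_t}\,dW_t^{v},
\qquad dW_t^{S}\,dW_t^{v} = \rho\,dt,
\end{aligned}
```

where $N_t$ is a Poisson process with intensity $\lambda$, $Y_t$ is the jump-size multiplier with $\mathbb{E}[Y_t - 1] = \bar{k}$ (so the drift term $\lambda\bar{k}$ compensates the jumps), $\kappa$ is the speed of mean reversion, $\theta$ the long-run variance level, and $\sigma_v$ the volatility of variance.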