61.
Finite difference methods are a popular technique for pricing American options. Since their introduction to finance by Brennan and Schwartz, their use has spread from vanilla calls and puts on one stock to path-dependent and exotic options on multiple assets. Despite the breadth of the problems they have been applied to, and the increased sophistication of some of the newer techniques, most approaches to pricing equity options have not adequately addressed the issues of unbounded computational domains and divergent diffusion coefficients. In this article it is shown that these two problems are related and can be overcome using multiple grids. This new technique allows options to be priced for all values of the underlying, and is illustrated using standard put options and the call on the maximum of two stocks. For the latter contract, I also derive a characterization of the asymptotic continuation region in terms of a one-dimensional option pricing problem, and give analytic formulae for the perpetual case.
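The domain-truncation issue the article addresses is visible in the baseline method it improves on. Below is a minimal sketch, not the article's multi-grid scheme, of an explicit finite-difference solver for an American put on a single truncated domain [0, Smax]; the function name, grid sizes, and boundary choices are illustrative assumptions.

```python
def american_put_fd(S0, K, r, sigma, T, Smax=300.0, M=150, N=1500):
    """Explicit finite-difference price of an American put under
    Black-Scholes on the truncated domain [0, Smax].  Early exercise is
    handled by projecting onto the payoff after each backward time step.
    The explicit scheme is only stable for dt small enough relative to
    dS, which is why N is large."""
    dS = Smax / M
    dt = T / N
    payoff = [max(K - i * dS, 0.0) for i in range(M + 1)]
    V = payoff[:]                        # value at expiry
    for _ in range(N):                   # march backwards in time
        Vn = V[:]
        for i in range(1, M):
            a = 0.5 * dt * (sigma ** 2 * i ** 2 - r * i)
            b = 1.0 - dt * (sigma ** 2 * i ** 2 + r)
            c = 0.5 * dt * (sigma ** 2 * i ** 2 + r * i)
            cont = a * Vn[i - 1] + b * Vn[i] + c * Vn[i + 1]
            V[i] = max(cont, payoff[i])  # early-exercise projection
        V[0] = K                         # exercise immediately at S = 0
        V[M] = 0.0                       # deep out-of-the-money boundary
    i = int(S0 / dS)                     # linear interpolation at S0
    w = S0 / dS - i
    return (1.0 - w) * V[i] + w * V[min(i + 1, M)]
```

Truncating at an arbitrary Smax is exactly the kind of domain restriction that the article's multiple-grid construction is designed to remove.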
62.
In the current paper, we present an integrated genetic programming (GP) environment called java GP modelling, an implementation of the steady-state GP algorithm, which evolves tree-based structures representing models of inputs and outputs. The motivation of this paper is to compare the GP algorithm with neural network (NN) architectures on the task of forecasting and trading the ASE 20 Greek Index (using autoregressive terms as inputs). This is done by benchmarking the forecasting performance of the GP algorithm against six autoregressive moving average (ARMA) and NN combination designs (hybrid and mixed variants of a higher order neural network (HONN), a recurrent neural network (RNN), and a classic multilayer perceptron, each combined with traditional techniques, either statistical, such as an ARMA model, or technical, such as a moving average convergence/divergence model) and against a naïve trading strategy. More specifically, the trading performance of all models is investigated in a forecast and trading simulation on ASE 20 daily closing prices over the period 2001–2008, with the last one and a half years reserved for out-of-sample testing. We use the ASE 20 daily series because many financial institutions are ready to trade at this level, so it is possible to leave orders with a bank to be transacted on that basis. As it turns out, the GP model does remarkably well and outperforms all other models in a simple trading simulation exercise. This remains the case when more sophisticated trading strategies using confirmation filters and leverage are applied: the GP model still produces better results and outperforms all other NN and traditional statistical models in terms of annualized return.
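One of the technical benchmarks named above, the moving average convergence/divergence model, reduces to comparing two exponential moving averages of the price. A minimal sketch follows; the common 12/26 spans are an illustrative assumption, not taken from the paper.

```python
def ema(prices, span):
    """Exponential moving average with smoothing alpha = 2 / (span + 1)."""
    alpha = 2.0 / (span + 1)
    out = [prices[0]]
    for p in prices[1:]:
        out.append(alpha * p + (1.0 - alpha) * out[-1])
    return out

def macd_signal(prices, fast=12, slow=26):
    """MACD line: fast EMA minus slow EMA.  A simple technical rule goes
    long when the line is positive and flat or short otherwise."""
    f, s = ema(prices, fast), ema(prices, slow)
    return [a - b for a, b in zip(f, s)]
```

On a steadily rising series the fast EMA sits above the slow one, so the MACD line is positive and the rule stays long; the reverse holds on a falling series.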
63.
This article is the second part of a review of recent empirical and theoretical developments usually grouped under the heading Econophysics. In the first part, we reviewed the statistical properties of financial time series, the statistics exhibited in order books and discussed some studies of correlations of asset prices and returns. This second part deals with models in Econophysics from the point of view of agent-based modeling. Of the large number of multi-agent-based models, we have identified three representative areas. First, using previous work originally presented in the fields of behavioral finance and market microstructure theory, econophysicists have developed agent-based models of order-driven markets that we discuss extensively here. Second, kinetic theory models designed to explain certain empirical facts concerning wealth distribution are reviewed. Third, we briefly summarize game theory models by reviewing the now classic minority game and related problems.
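The minority game mentioned at the end can be stated in a few lines: an odd number of agents repeatedly choose one of two sides, the minority side wins, and each agent plays whichever of its fixed lookup-table strategies has scored best on the recent outcome history. A toy sketch, with all sizes illustrative:

```python
import random

def minority_game(n_agents=101, memory=3, n_strategies=2, rounds=200, seed=0):
    """Toy minority game: each round, every agent plays the action its
    currently best-scoring strategy prescribes for the recent outcome
    history; strategies that picked the minority side gain a point."""
    rng = random.Random(seed)
    n_hist = 2 ** memory
    # a strategy is a lookup table: history index -> action in {+1, -1}
    strategies = [[[rng.choice((1, -1)) for _ in range(n_hist)]
                   for _ in range(n_strategies)] for _ in range(n_agents)]
    scores = [[0] * n_strategies for _ in range(n_agents)]
    history = 0
    attendance = []                      # net action, the usual observable
    for _ in range(rounds):
        actions = [strategies[a][max(range(n_strategies),
                                     key=lambda s: scores[a][s])][history]
                   for a in range(n_agents)]
        total = sum(actions)
        minority = -1 if total > 0 else 1
        attendance.append(total)
        for a in range(n_agents):        # reward minority-side strategies
            for s in range(n_strategies):
                if strategies[a][s][history] == minority:
                    scores[a][s] += 1
        history = ((history << 1) | (1 if minority == 1 else 0)) % n_hist
    return attendance
```

The attendance series is the quantity whose anomalously low variance (relative to random choice) made the game a staple of the econophysics literature.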
64.
We compare several parametric and non-parametric approaches for modelling variance swap curves by conducting an in-sample and an out-of-sample analysis using market prices. The forecasted Heston model gives the best overall performance. Moreover, the static Heston model highlights some problems of stochastic volatility models in option pricing of forward starting products.
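Under Heston dynamics the fair variance swap strike has a closed form: the instantaneous variance mean-reverts, so E[v_t] decays exponentially from v0 toward the long-run level theta, and the strike is its time average over the swap's life. A sketch; the parameter values in the test are illustrative, whereas the paper calibrates them to market prices.

```python
import math

def heston_variance_swap_strike(v0, kappa, theta, T):
    """Fair variance swap strike under Heston: the time average over
    [0, T] of E[v_t] = theta + (v0 - theta) * exp(-kappa * t)."""
    x = kappa * T
    if x < 1e-12:
        return v0                    # limit as T -> 0 (or kappa -> 0)
    return theta + (v0 - theta) * (1.0 - math.exp(-x)) / x
```

Evaluating this across maturities T traces out a model-implied variance swap curve, which is the object being fitted and forecasted in the comparison above.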
65.
We propose a new model of the liquidity-driven banking system focusing on overnight interbank loans. This significant branch of the interbank market is commonly neglected in banking system modelling and systemic risk analysis. We construct a model where banks are allowed to use both the interbank and the securities markets to manage their liquidity demand and supply as driven by prudential requirements in a volatile environment. The network of interbank loans is dynamic and simulated every day. We show how intrasystem cash fluctuations alone, without any external shocks, may lead to systemic defaults, which may be a symptom of self-organized criticality in the system. We also analyze the impact of different prudential regulations and market conditions on interbank market resilience. We confirm that the central bank's asset purchase programmes, by limiting the declines in government bond prices, can successfully stabilize banks' liquidity demands. The model can be used to analyze the interbank market impact of macroprudential tools.
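The liquidity mechanism described, with banks covering deficits first on the interbank market and then by selling securities, can be caricatured in a one-day sketch. Everything here (the pooled lending, the haircut, the function name) is an illustrative assumption, far simpler than the paper's dynamic daily network.

```python
def defaulting_banks(cash, securities, haircut=0.05):
    """One settlement day in a toy interbank model: deficit banks first
    borrow from the pooled surplus of the others, then fire-sell their
    securities at a haircut; whoever still cannot cover its deficit
    defaults.  (Loan repayment and lender bookkeeping are omitted in
    this one-day sketch.)"""
    pool = sum(c for c in cash if c > 0)
    defaults = []
    for i, c in enumerate(cash):
        if c >= 0:
            continue
        need = -c
        take = min(need, pool)           # overnight interbank borrowing
        pool -= take
        need -= take
        need -= min(need, securities[i] * (1.0 - haircut))  # fire sale
        if need > 1e-12:
            defaults.append(i)
    return defaults
```

Even this toy shows the paper's central point in miniature: whether a cash shortfall becomes a default depends jointly on the system-wide surplus and on the price (here, haircut) at which securities can be liquidated.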
66.
Recently, Cooray & Ananda (2005) proposed a composite lognormal-Pareto model for use with loss payments data of the sort arising in the actuarial and insurance industries. Their model is based on a lognormal density up to an unknown threshold value and a two-parameter Pareto density thereafter. Here we identify and discuss limitations of this composite lognormal-Pareto model which are likely to severely curtail its potential for practical application to real-world data sets. In addition, we present two different composite models based on lognormal and Pareto models in order to address these concerns. The performance of all three composite models is discussed and compared in the context of an example based upon a well-known fire insurance data set.
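The composite construction can be sketched directly: a lognormal body truncated at a threshold theta and a Pareto tail beyond it, with a mixing weight chosen so the density is continuous at theta. The Cooray-Ananda model additionally imposes differentiability at theta, which ties the parameters together further; this sketch enforces continuity only, and the parameter values in the test are illustrative.

```python
import math

def lognormal_pdf(x, mu, sigma):
    return math.exp(-((math.log(x) - mu) ** 2) / (2 * sigma ** 2)) / (
        x * sigma * math.sqrt(2 * math.pi))

def lognormal_cdf(x, mu, sigma):
    return 0.5 * (1 + math.erf((math.log(x) - mu) / (sigma * math.sqrt(2))))

def composite_pdf(x, mu, sigma, theta, alpha):
    """Composite lognormal-Pareto density: truncated lognormal below the
    threshold theta, Pareto above it.  The weight r on the body is fixed
    by requiring continuity at theta."""
    head = lognormal_pdf(theta, mu, sigma) / lognormal_cdf(theta, mu, sigma)
    tail = alpha / theta                 # Pareto density at its left end
    r = tail / (head + tail)             # continuity at theta
    if x <= theta:
        return r * lognormal_pdf(x, mu, sigma) / lognormal_cdf(theta, mu, sigma)
    return (1 - r) * alpha * theta ** alpha / x ** (alpha + 1)
```

Freeing the weight r from the continuity-plus-differentiability constraints is, in spirit, how later composite variants gain the flexibility the original model lacks.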
67.
A model for the statistical analysis of the total amount of insurance paid out on a policy is developed and applied. The model simultaneously deals with the number of claims (zero or more) and the amount of each claim. The number of claims follows a Poisson-based discrete distribution. Individual claim sizes follow a continuous right-skewed distribution. The resulting distribution of total claim size is a mixed discrete-continuous model, with positive probability of a zero claim. The means and dispersions of the claim frequency and claim size distributions are modeled in terms of risk factors. The model is applied to a car insurance data set.
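The mixed discrete-continuous structure, a positive probability of a zero total claim and continuous claim sizes otherwise, is easy to see by simulation. A sketch assuming a Poisson claim count and gamma claim sizes; the paper's specific Poisson-based count and severity families may differ.

```python
import math
import random

def simulate_total_claims(lam, shape, scale, n_policies=10000, seed=1):
    """Per-policy total claim: Poisson(lam) claim count and
    gamma(shape, scale) individual claim sizes.  The total is exactly
    zero with probability exp(-lam)."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_policies):
        n, u = 0, rng.random()
        p = c = math.exp(-lam)
        while u > c:                     # Poisson draw by CDF inversion
            n += 1
            p *= lam / n
            c += p
        totals.append(sum(rng.gammavariate(shape, scale) for _ in range(n)))
    return totals
```

With lam = 0.2 roughly 82% of simulated policies have a zero total, which is the point mass that a purely continuous loss model cannot represent.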
68.
Abstract

Of the regulations governing the state-subsidized health insurance funds, I shall report here only as much as is necessary for the construction of the mathematical formulas, together with some facts that matter when making comparisons with other morbidity data. For the rest, the reader is referred to the Health Insurance Funds Act.
69.
We investigate the asymmetric relationship between returns and implied volatility for 20 developed and emerging international markets. In particular, we examine how the sign and size of return innovations affect expectations of daily changes in volatility. Our empirical findings indicate that the conditional contemporaneous return-volatility relationship varies not only with the sign of expected returns but also with their magnitude, in line with recent results from the behavioral finance literature. We find evidence of an asymmetric and reverse return-volatility relationship in many advanced, Asian, Latin American, European and South African markets. We show that the US market displays the strongest reaction to price falls, Asian markets present the lowest sensitivity to volatility expectations, while the Euro area is characterized by a homogeneous response both in terms of direction and impact. These results may plausibly be attributed to cultural and societal characteristics. An extensive quantile regression analysis demonstrates that the detected asymmetric pattern varies particularly across the extreme distribution tails, i.e., in the highest and lowest quantile ranges. Indeed, the classical feedback and leverage hypotheses appear implausible, whilst behavioral theories emerge as the new paradigm in real-world applications.
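The quantile-regression machinery behind that analysis rests on the check (pinball) loss: the tau-quantile is whatever minimizes it. In the intercept-only case this reduces to a sample quantile, which makes the idea easy to verify; the data and function names below are illustrative.

```python
def pinball_loss(q, ys, tau):
    """Check (pinball) loss used in quantile regression: residuals are
    weighted tau above the candidate q and (1 - tau) below it."""
    return sum((y - q) * (tau - (1.0 if y < q else 0.0)) for y in ys)

def sample_quantile(ys, tau):
    """The tau-quantile minimizes the pinball loss; for an intercept-only
    'regression' the minimizer is attained at one of the data points."""
    return min(ys, key=lambda q: pinball_loss(q, ys, tau))
```

Fitting at extreme tau values (0.05, 0.95) rather than only at the median is what lets the analysis above detect asymmetry concentrated in the distribution tails.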
70.
This article explores the relationships between several forecasts for the volatility built from multi-scale linear ARCH processes, and linear market models for the forward variance. This shows that the structures of the forecast equations are identical, but with different dependencies on the forecast horizon. The process equations for the forward variance are induced by the process equations for an ARCH model, but postulated in a market model. In the ARCH case, they differ from the usual diffusive type. The conceptual differences between both approaches and their implications for volatility forecasts are analysed. The volatility forecast is compared with the realized volatility (the volatility that will occur between date t and t + ΔT) and the implied volatility (corresponding to an at-the-money option with expiry at t + ΔT). For the ARCH forecasts, the parameters are set a priori. An empirical analysis across multiple time horizons ΔT shows that a forecast provided by an I-GARCH(1) process (one time scale) does not correctly capture the dynamics of the realized volatility. An I-GARCH(2) process (two time scales, similar to GARCH(1,1)) performs better, while a long-memory LM-ARCH process (multiple time scales) correctly replicates the dynamics of the implied and realized volatilities and delivers consistently good forecasts for the realized volatility.
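The single-time-scale I-GARCH(1) forecast that the analysis finds inadequate is simply an exponentially weighted moving average of squared returns. A sketch; the decay lam = 0.94 is the common RiskMetrics-style choice and an assumption here, not a value from the article.

```python
def igarch1_forecast(returns, lam=0.94):
    """One-time-scale I-GARCH(1) variance forecast path:
    sigma2_{t+1} = (1 - lam) * r_t**2 + lam * sigma2_t.
    A single decay constant lam is exactly the 'one time scale'
    limitation discussed above."""
    sigma2 = returns[0] ** 2 if returns[0] != 0.0 else 1e-8
    path = []
    for r in returns:
        sigma2 = (1.0 - lam) * r * r + lam * sigma2
        path.append(sigma2)
    return path
```

Multi-scale variants (I-GARCH(2), LM-ARCH) replace the single lam with a combination of several decay constants, which is what allows them to track both fast and slow components of realized volatility.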