151.
This paper presents a technoeconomic evaluation of a third-generation (3G) rollout scenario, followed by the identification of the market conditions for two operators in a simple game-theoretic model. The scenarios considered reflect the points of view of both dominant operators and new entrants. Technoeconomic results are presented in terms of net present value (NPV), which acts as the pay-off function in the proposed game-theoretic model. The business case presented here was studied using a tool developed by the European projects IST-25172 TONIC/ECOSYS, a model developed by the EURESCOM P901 project, and the game-theory tool Gambit.
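The mechanics of using NPVs as pay-offs can be sketched with a toy two-operator entry game. All pay-off figures below are hypothetical illustrations, not values from the paper or from the TONIC/ECOSYS tool; the sketch only shows how an equilibrium is read off once NPVs fill the pay-off matrix.

```python
# Hypothetical 2x2 rollout game: each operator chooses "invest" or "wait".
# Pay-offs are illustrative NPVs (in M EUR), not figures from the paper.
ACTIONS = ["invest", "wait"]
# payoff[(a1, a2)] = (NPV of operator 1, NPV of operator 2)
payoff = {
    ("invest", "invest"): (40, 35),
    ("invest", "wait"):   (90, 10),
    ("wait",   "invest"): (15, 80),
    ("wait",   "wait"):   (20, 20),
}

def pure_nash_equilibria(payoff):
    """Return profiles where neither operator gains by deviating unilaterally."""
    eqs = []
    for a1 in ACTIONS:
        for a2 in ACTIONS:
            u1, u2 = payoff[(a1, a2)]
            best1 = all(u1 >= payoff[(b, a2)][0] for b in ACTIONS)
            best2 = all(u2 >= payoff[(a1, b)][1] for b in ACTIONS)
            if best1 and best2:
                eqs.append((a1, a2))
    return eqs

print(pure_nash_equilibria(payoff))  # with these NPVs, both operators investing is stable
```

With these (made-up) NPVs, mutual investment is the unique pure-strategy equilibrium; a tool such as Gambit performs the same best-response check on the NPV matrices produced by the technoeconomic model.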
152.
This paper proposes a new framework for estimating and forecasting the diffusion of high-technology products, along with the construction of a price index. In that context, the “diffusion–price” model is presented as an innovative concept providing long-term estimates of both price and diffusion elasticity. This corresponds to the bidirectional estimation of the mutual influence of a product’s price on its expected diffusion and vice versa. The distinct parts of the methodology are the use of a diffusion model for the initial estimation of diffusion, the construction of a price-index function for estimating the pricing mechanism and, finally, the construction of the “diffusion–price” model for estimating and adjusting the diffusion level and price quantities. The case studies examined, which were solved using genetic algorithms, yielded remarkable results that can inform business-strategy development, as pricing policy can make diffusion diverge substantially from initial estimates. The case studies correspond to the diffusion of ADSL technology in the wider European area.
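The bidirectional coupling can be illustrated with a minimal loop in which diffusion follows a logistic step, price declines as diffusion accumulates, and a lower price in turn raises the adoption rate. Every parameter below (market potential, learning rate, elasticity) is a hypothetical placeholder, not a calibrated value from the paper, and the genetic-algorithm fitting step is omitted.

```python
# Sketch of a coupled diffusion-price loop (all parameters hypothetical,
# not calibrated values from the paper). Diffusion follows a logistic
# model; the price index declines with cumulative diffusion, which in
# turn accelerates adoption through a price-elasticity term.
M = 1.0           # market potential (normalised)
p0 = 100.0        # launch price
learning = 0.3    # price drop per unit of diffusion (hypothetical)
base_rate = 0.4   # intrinsic adoption rate
elasticity = 0.5  # sensitivity of adoption to price reductions

def simulate(periods):
    n, price, path = 0.01, p0, []
    for _ in range(periods):
        rate = base_rate * (1 + elasticity * (p0 - price) / p0)
        n += rate * n * (M - n)          # logistic diffusion step
        price = p0 * (1 - learning * n)  # price index falls with diffusion
        path.append((round(n, 4), round(price, 2)))
    return path

path = simulate(20)
```

Running the loop shows the feedback the abstract describes: diffusion rises toward the market potential while the price index falls, and the falling price pulls the diffusion path above what the uncoupled logistic model would predict.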
153.
The present article investigates the validity of the Comanor–Wilson minimum efficient size (MES) measure. The basic assumption is that firms that have exhausted scale economies operate under non-increasing returns to scale. The same firms are also assumed to have a size greater than the MES estimated on sales (total turnover), employment or fixed assets. Data envelopment analysis (DEA) is used on a sample of firms in three Greek manufacturing industries to classify operating firms as exhibiting increasing or non-increasing returns to scale. On the basis of the results of the DEA input-oriented model, the MES measure correctly predicts over 85% of the cases. A probit model is applied to the cases in which the MES and DEA predictions of returns to scale differ. Results indicate that technical efficiency, size and age are the factors that compel the MES to yield the same prediction as the DEA approach.
Corresponding author: Kostas Tsekouras
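The validity check itself reduces to comparing a threshold rule against the DEA classification. The sketch below uses invented firm data and an invented MES value purely to show the agreement-rate computation; the paper's actual DEA model and sample are not reproduced here.

```python
# Toy illustration of the validity check (all firm data hypothetical):
# the Comanor-Wilson rule predicts non-increasing returns to scale (NIRS)
# for firms at least as large as the MES; we compare that prediction with
# a DEA-style classification supplied alongside each firm.
MES = 50.0  # minimum efficient size, e.g. in employment terms (hypothetical)

firms = [
    # (size, returns-to-scale class from a DEA input-oriented model)
    (120.0, "NIRS"),
    (80.0,  "NIRS"),
    (30.0,  "IRS"),
    (55.0,  "IRS"),   # a disagreement case, passed on to the probit stage
    (10.0,  "IRS"),
]

def mes_prediction(size):
    """Threshold rule: size >= MES implies non-increasing returns."""
    return "NIRS" if size >= MES else "IRS"

agreement = sum(mes_prediction(s) == rts for s, rts in firms) / len(firms)
print(f"MES agrees with DEA in {agreement:.0%} of cases")  # 80% here
```

In the paper's setting the disagreement cases (like the fourth firm above) are the observations fed into the probit model to explain why the two classifications diverge.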
154.
Total citations: 2 (self-citations: 0, citations by others: 2)
In this paper, we reconsider the appropriateness of certain statistical analyses in innovation-adoption studies and suggest that partial-observability models may sometimes be more useful. The proposed models allow a flexible specification of the adoption process from one stage to two stages, facilitate the modelling of non-adopters, and remedy the violation of the full-information assumption. An application to the adoption of organic cultivation in Greece demonstrates the relative merits of the proposed analysis.
155.
This study tries to fill a gap in the literature on the relevance of economic fundamentals for Euro/USD exchange-rate determination. We adopt the monetary model of exchange-rate determination as our testing vehicle and investigate the relevance of various versions of this model over a long horizon, spanning the period from the inception of the Euro to the present. We rely on cointegration analysis for our empirical research and, in accordance with the relevant literature, fail to validate most variants of the model. However, we obtain encouraging results from an expanded version of the monetary model in which demand and productivity factors appear among the exchange-rate determinants.
156.
    
We revisit the methodology and historical development of subsampling, and then explore in detail its use in hypothesis testing, an area which has received surprisingly modest attention. In particular, we explore the general set-up of a possibly high-dimensional parameter with data from K populations. The role of centring the subsampling distribution is highlighted, and it is shown that hypothesis testing with a data-centred subsampling distribution is more powerful. In addition, we demonstrate subsampling’s ability to handle a non-standard Behrens–Fisher problem, i.e., a comparison of the means of two or more populations which may possess not only different and possibly infinite variances but also different distributions. Our formulation is general, however, permitting even functional data and/or statistics. Finally, we provide theory for K-sample U-statistics that helps establish the asymptotic validity of subsampling confidence intervals and tests in this very general setting.
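A loose illustration of the centring idea: recompute the statistic on many subsamples of size b much smaller than n, centre at the full-sample estimate, and compare a rescaled observed statistic against that distribution. The sample sizes, distributions and the simplified root-b/root-n normalisation below are our own choices for a sketch, not the paper's construction.

```python
import math
import random
import statistics

random.seed(0)
# Two populations with different means AND different variances
# (a Behrens-Fisher-type situation); all numbers are hypothetical.
x = [random.gauss(0.0, 1.0) for _ in range(200)]
y = [random.gauss(0.5, 2.0) for _ in range(200)]

def stat(a, b_):
    """Test statistic: difference of sample means."""
    return statistics.fmean(a) - statistics.fmean(b_)

theta_hat = stat(x, y)   # full-sample estimate
n, b = len(x), 25        # subsample size b much smaller than n

# Subsampling distribution, centred at theta_hat and rescaled by sqrt(b)
# (a simplified normalisation for this sketch):
scaled = []
for _ in range(500):
    sub = stat(random.sample(x, b), random.sample(y, b))
    scaled.append(math.sqrt(b) * (sub - theta_hat))

# Two-sided p-value for H0: equal means, comparing sqrt(n)*theta_hat
# against the centred subsampling distribution
t_obs = math.sqrt(n) * abs(theta_hat)
p = sum(abs(s) >= t_obs for s in scaled) / len(scaled)
```

Centring at `theta_hat` rather than at the hypothesised value is the step the paper shows to yield more powerful tests; the same recipe applies unchanged when the two populations have different shapes, which is what makes the non-standard Behrens–Fisher comparison tractable.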
157.
    
Since the banking crisis, the market for volatility exchange-traded products has developed rapidly as it opens to clients beyond the large institutional-investor pool. Speculation is driven by increasingly complex leveraged and inverse exposures, including those that attempt to trade on significant roll costs in volatility futures curves. Longer-term investors use these products for equity diversification, driven by fears of an ongoing Eurozone crisis. We survey the burgeoning academic literature in this area and present a comprehensive and up-to-date comparison of the market and statistical characteristics of European and US exchange-traded volatility products.
158.
    
To explore the complex relationships between the S&P500, the VIX and trading volume, we introduce a Granger-causality test using the nonlinear statistic of asymmetric partial transfer entropy (APTE). A simulation exercise shows that the APTE offers precise information on the nature of the connectivity. Our empirical findings concretize the information flow linking volume, the S&P500 and the VIX, and merge the leverage effect and the asymmetric stock return-volume relationship into a unified framework of analysis. More specifically, when we condition on the tails, the detected causal channel provides empirical validation of the contribution of noise trading to large swings in financial markets, through increased trading volume and the consequent worsening of the ability of market prices to adjust to new information.
159.
The year 2010 was a key year for European railway transport, as it marked the liberalization of the railway sector in a context of economic crisis. The railway sector is a driving force behind the economy of any country. In the case of Spain in particular, the sector is undergoing a process of liberalization following large public investments that have provided the country with one of the most extensive high-speed railway networks in Europe. Using a methodological approach that seeks a balance between future studies and constructivist studies of the interaction between technology and society, we examine the present and future consequences of railway-transport liberalization in the case study of Spain, focusing on a key aspect of the process: changes in occupational health and safety conditions in a sector that must ensure full passenger, worker and freight safety. Through a comparison of actual, perceived and foreseeable risks, we analyze the main shortcomings of the liberalization model currently being implemented and strategies for dealing with foreseeable risks in a scenario of change.
160.
    
This article presents estimates of labour values and prices of production following two approaches: the first is based on the classical and Marxian theory of value and distribution; the second on the so-called ‘new solution’ to the ‘transformation problem’ and its variant, the temporary single-system interpretation (TSSI). The major advantage of the latter approach is its simplicity, along with its relatively low data requirements. Our empirical findings from the economies of China, Japan and South Korea suggest that both approaches give estimates of labour values and prices of production that are extremely close to each other as well as to actual market prices. On further examination, however, we conclude that our empirical findings are fully consistent with the theoretical requirements of the classical approach and contradict those of the TSSI.