61.
Conformity testing is a systematic examination of the extent to which an entity conforms to specified requirements. Such testing is performed in industry as well as in regulatory agencies in a variety of fields. In this paper we discuss conformity testing under measurement or sampling uncertainty. Although the situation has many analogies to statistical testing of a hypothesis concerning the unknown value of the measurand there are no generally accepted rules for handling measurement uncertainty when testing for conformity. Usually the objective of a test for conformity is to provide assurance of conformity. We therefore suggest that an appropriate statistical test for conformity should be devised such that there is only a small probability of declaring conformity when in fact the entity does not conform. An operational way of formulating this principle is to require that whenever an entity has been declared to be conforming, it should not be possible to alter that declaration, even if the entity was investigated with better (more precise) measuring instruments, or measurement procedures. Some industries and agencies designate specification limits under consideration of the measurement uncertainty. This practice is not invariant under changes of measurement procedure. We therefore suggest that conformity testing should be based upon a comparison of a confidence interval for the value of the measurand with some limiting values that have been designated without regard to the measurement uncertainty. Such a procedure is in line with the recently established practice of reporting measurement uncertainty as “an interval of values that could reasonably be attributed to the measurand”. The price to be paid for a reliable assurance of conformity is a relatively large risk that the procedure will fail to establish conformity for entities that only marginally conform. 
We suggest a two‐stage procedure that may improve this situation and provide better discriminatory ability. In an example we illustrate the determination of the power function of such a two‐stage procedure.
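The decision rule described above — declare conformity only when the entire uncertainty interval for the measurand lies inside the specification limits — can be sketched as follows. This is a minimal illustration, not the paper's procedure: the function name, the coverage factor k ≈ 2 (roughly 95% coverage), and the use of the standard uncertainty of the mean are all assumptions for the example.

```python
import math
import statistics

def conforms(measurements, lower_spec, upper_spec, k=2.0):
    """Hypothetical helper: declare conformity only when the whole
    expanded-uncertainty interval lies inside the specification limits."""
    mean = statistics.fmean(measurements)
    # Standard uncertainty of the mean; k ~ 2 gives roughly 95% coverage
    u = statistics.stdev(measurements) / math.sqrt(len(measurements))
    lo, hi = mean - k * u, mean + k * u
    return lower_spec <= lo and hi <= upper_spec
```

Note how this rule embodies the asymmetry discussed above: a more precise instrument shrinks the interval and can only make a non-conforming entity easier to reject, never overturn a declaration of conformity — while marginally conforming entities are likely to fail the test.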
62.
Despite evidence that information technology (IT) has recently become a productive investment for a large cross-section of firms, a number of questions remain. Some of these issues can be addressed by extending the basic production function approach that was applied in earlier work. Specifically, in this short paper we: 1) control for individual firm differences in productivity by employing a ‘firm effects’ specification, 2) consider the more flexible translog specification instead of only the Cobb-Douglas specification, and 3) allow all parameters to vary between various subsectors of the economy.

We find that while ‘firm effects’ may account for as much as half of the productivity benefits imputed to IT in earlier studies, the elasticity of IT remains positive and statistically significant. We also find that the estimates of IT elasticity and marginal product are little changed when the less restrictive translog production function is employed. Finally, we find only limited evidence of differences in IT's marginal product between manufacturing and services and between the ‘measurable’ and ‘unmeasurable’ sectors of the economy. Surprisingly, we find that the marginal product of IT is at least as high in firms that did not grow during the 1988–1992 sample period as it is in firms that grew.
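The ‘firm effects’ specification mentioned above amounts to absorbing a time-invariant firm-specific productivity term before estimating input elasticities. A minimal sketch on simulated data, using the within (demeaning) transformation for a Cobb-Douglas form in logs — the variable names, coefficient values, and data are all illustrative assumptions, not the paper's dataset:

```python
import numpy as np

rng = np.random.default_rng(0)
n_firms, n_years = 50, 5
firm = np.repeat(np.arange(n_firms), n_years)

# Simulated log inputs (capital, labor, IT) and a firm-specific effect
cap, lab, it = (rng.normal(size=firm.size) for _ in range(3))
alpha = rng.normal(size=n_firms)[firm]  # the "firm effects"
y = 0.3 * cap + 0.6 * lab + 0.05 * it + alpha + 0.1 * rng.normal(size=firm.size)

def demean(x):
    # Within transformation: subtract each firm's mean to absorb its fixed effect
    means = np.bincount(firm, weights=x) / np.bincount(firm)
    return x - means[firm]

X = np.column_stack([demean(cap), demean(lab), demean(it)])
coefs, *_ = np.linalg.lstsq(X, demean(y), rcond=None)
```

If the firm effects are correlated with IT intensity, pooled OLS overstates the IT elasticity; the within estimator recovers it from within-firm variation only, which is the intuition behind the "as much as half" finding above.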
63.
Bertrand supergames with non‐binding communication are used to study price formation and stability of collusive agreements on experimental duopoly markets. The experimental design consists of three treatments with different costs of communication: zero‐cost, low‐cost and high‐cost. Prices are found to be significantly higher when communication is costly. Moreover, costly communication decreases the number of messages, but more importantly, it enhances the stability of collusive agreements. McCutcheon (1997) presents an interesting application to antitrust policy by letting the cost of communication symbolize the presence of an antitrust law that prohibits firms from discussing prices. Although our experimental results do not support the mechanism of McCutcheon's (1997) argument, the findings point in the direction of her prediction that antitrust laws might work in the interest of firms.
64.
Achieving an impact on business decision-makers with foresight does not appear to be an easy task. The Macro Trends team at Deutsche Bank Research has therefore formulated criteria to guide foresight projects. Regarding content, foresight results should be plausible, convenient, inspiring, and framed within an appropriate time perspective. In addition, a structured way of producing and delivering foresight, seamless inclusion in organisational procedures, a high level of interaction with decision-makers, ideational entrepreneurship, innovation in communicating with business people, and persistence and synchronisation with the business organisation are the key criteria for achieving a higher impact from foresight projects. To live up to these criteria, the Macro Trends team has developed a 'trend map' that provides a conceptual aggregation of trends, offering orientation for decision-makers and stakeholders.
65.
U.S. companies that need capital may choose between selling securities in the private and public markets. These venues differ in terms of direct issuance costs, the required information disclosed, the liability incurred, and the mechanics of the capital-raising process itself. During the last two decades, the Securities and Exchange Commission (SEC) has deregulated private offerings by broadening their investor base and increasing secondary market liquidity. At the same time, SEC policy has bifurcated the offering process in the public market into two distinct segments based largely on company size and seasoning. Large public issuers have seen a gradual deregulation and acceleration of their capital-raising processes. Important changes for issuers include allowing them to incorporate information into registration statements by reference to Exchange Act reports, to use shelf registration to speed up offers, and to place securities offshore with less regulatory uncertainty. Though small issuers enjoy some of the benefits of these changes, deregulation of their offerings has been somewhat less pronounced. In a Commission report and a subsequent concept release, the SEC indicates it may restructure and unify these three disparate strands of capital raising through an innovative schema of registering companies rather than securities.
66.
Complete demand systems often are estimated under the assumption that preferences are constant. If preferences do change in reality, this leads to a specification error. This note explores some of the implications of such a specification error.
67.
House prices, money, credit, and the macroeconomy
This paper assesses the links between money, credit, house prices, and economic activity in industrialized countries over the last three decades. The analysis is based on a fixed-effects panel vector autoregression, estimated using quarterly data for 17 industrialized countries spanning the period 1970–2006. The main results of the analysis are the following. (i) There is evidence of a significant multidirectional link between house prices, monetary variables, and the macroeconomy. (ii) The link between house prices and monetary variables is found to be stronger over a more recent sub-sample from 1985 to 2006. (iii) The effects of shocks to money and credit are found to be stronger when house prices are booming.
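The fixed-effects panel VAR used above can be estimated, in its simplest first-order form, by demeaning each country's series and pooling the lagged regressions. The sketch below uses simulated data; the dimensions, the transition matrix, and the variable ordering (e.g. house prices, credit, activity) are assumptions for illustration, and the within transformation is only approximately unbiased here because the time dimension is long (Nickell bias shrinks with T).

```python
import numpy as np

rng = np.random.default_rng(1)
n_countries, T, n_vars = 17, 40, 3  # e.g. house prices, credit, activity

# Assumed stationary VAR(1) transition matrix for the simulation
A = np.array([[0.5, 0.1, 0.0],
              [0.0, 0.4, 0.1],
              [0.1, 0.0, 0.3]])

data = np.zeros((n_countries, T, n_vars))
fe = rng.normal(size=(n_countries, n_vars))  # country fixed effects
for t in range(1, T):
    shocks = 0.1 * rng.normal(size=(n_countries, n_vars))
    data[:, t] = data[:, t - 1] @ A.T + fe + shocks

# Fixed-effects VAR(1): demean within each country, then pool and regress
demeaned = data - data.mean(axis=1, keepdims=True)
Y = demeaned[:, 1:].reshape(-1, n_vars)
X = demeaned[:, :-1].reshape(-1, n_vars)
A_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)  # A_hat.T estimates A
```

From the estimated system, impulse responses (e.g. of house prices to a credit shock) are obtained by iterating the transition matrix, which is how links such as result (iii) above are typically traced out.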
70.
This paper relates the financial and monetary dimensions of the contemporary economic crisis to working-class agency via a central concern of classical political economy: the distribution of surplus between the chief factors of production. The fall in the wage share of value added is now accepted as a stylised fact in the empirical economic literature. This paper argues that the punctuated pattern of this development validates the regulation-theoretical narrative of an epochal shift from Fordism to finance-led accumulation. Furthermore, synthesising econometric studies supports a class-centred explanation. In the last instance, the falling wage share is due to successful transnational class rule in the form of a neoliberal hegemonic paradigm. Crucially, such class rule restructured the environment of trade unions, rendering their relational power resources increasingly ineffective. The paper concludes by considering the contradictory implications for organised labour of the current financial crisis. On the one hand, the financial crisis offers an opportunity to link its particular interests to the general interest of macroeconomic management, since a low wage share inhibits growth rates. But how might trade unions assert a higher wage share in the face of the structural power of (financial) capital?