Similar Literature
20 similar documents found.
1.
Third order rotatability of experimental designs, moment matrices and information surfaces is investigated, using a Kronecker power representation. This representation complicates the model but greatly simplifies the theoretical development, and throws light on difficulties experienced in some previous work. Third order rotatability is shown to be characterized by the finitely many transformations consisting of permutations and a bi-axial 45 degree rotation, and the space of rotatable third order symmetric matrices is shown to be of dimension 20, independent of the number of factors m. A general Moore–Penrose inverse of a third order rotatable moment matrix is provided, leading to the information surface, and the corresponding optimality results are discussed. After a brief literature review, extensions to higher order models, the connections with tensor representations of classical matrix groups, and the evaluation of a general dimension formula are all explored.
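As a rough sketch of what a Kronecker power representation looks like in practice (an illustrative assumption, not the paper's actual construction), the third order regression vector for a design point with m factors stacks the constant term with x and its Kronecker square and cube:

```python
import numpy as np

def third_order_regressors(x):
    """Regression vector under a Kronecker power representation:
    (1, x, x (x) x, x (x) x (x) x), so its length is 1 + m + m^2 + m^3."""
    x = np.asarray(x, dtype=float)
    xx = np.kron(x, x)
    return np.concatenate(([1.0], x, xx, np.kron(xx, x)))

# For m = 2 factors the vector has 1 + 2 + 4 + 8 = 15 entries.
f = third_order_regressors([1.0, -1.0])
print(f.shape)   # (15,)
```

The Kronecker form carries redundant interaction terms (hence the "complicates the model" remark in the abstract), but it makes the rotation-group action on the regressors linear.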

2.
Screening designs are useful for situations where a large number of factors are examined but only a few, k, of them are expected to be important. Traditionally, orthogonal arrays such as Hadamard matrices and Plackett–Burman designs have been studied for this purpose. It is therefore of practical interest, for a given k, to know all the classes of inequivalent projections of the design into k dimensions that have certain statistical properties. In this paper we present 15 inequivalent Hadamard matrices of order n=32 constructed from circulant cores. We study their projection properties using several well-known statistical criteria, and we provide minimum generalized aberration two-level designs with 32 runs and up to seven factors that are embedded in these Hadamard matrices. A concept of generalized projectivity and the selection of such designs are also discussed. AMS Subject Classification: Primary 62K15, Secondary 05B20
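The abstract's order-32 matrices are built from circulant cores, which are not reproduced here; the following minimal sketch instead uses Sylvester's classical doubling construction simply to illustrate the defining Hadamard property H Hᵀ = nI:

```python
import numpy as np

def sylvester_hadamard(k):
    """Hadamard matrix of order 2**k via Sylvester's doubling construction.
    (The paper's order-32 matrices come from circulant cores instead; this
    classical construction just illustrates the defining property.)"""
    H = np.array([[1]])
    for _ in range(k):
        H = np.block([[H, H], [H, -H]])
    return H

H = sylvester_hadamard(5)            # order n = 32
# Defining property of a Hadamard matrix: H @ H.T == n * I
assert np.array_equal(H @ H.T, 32 * np.eye(32))
```

Projections of such a matrix onto k columns give the k-factor screening designs whose equivalence classes the paper enumerates.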

3.
Models of club goods, local public goods, and growth controls appear to have theoretical structures distinct from usual oligopoly models. This article shows, however, that they are special cases of a generalized oligopoly model that incorporates the possibility of two-part pricing and externalities between consumers (either congestion or network externalities). Our generalized two-part pricing model not only serves as a synthesis of a wide range of models but also allows us to obtain several new results on equilibrium prices. Another advantage of our model is that it can be interpreted as a reduced form of more complicated models that have spatial structures. This facilitates extension to the case where firms are heterogeneous and the number of firms is arbitrary.

4.
Summary. The analysis of the usual balanced incomplete block designs is extended to the case where, in addition to the experimental treatments, one or several control treatments are contained in each block.

5.
This paper presents the results of an experimental determination by Monte Carlo techniques of the power functions of a generalized Friedman's rank test based on "standardized" ranks and a proposed test procedure based on aligned ranks for orthogonal designs. The simulation is carried out for five orthogonal designs and a number of Normal location alternatives and gives some information about the difference in power of the two test procedures for some orthogonal designs.

6.
Assuming transaction cost economics as a normative tool, we investigate the relationship between firms' ‘observed’ vertical integration choices and their economic performance. We use a two‐stage methodology: in the first, a measure of governance misalignment is computed as the difference between the governance form (i.e., ownership or outsourcing) predicted by transaction cost economics and the form actually observed; the second stage consists of estimating a performance equation where the misalignment variable is introduced together with a set of independent variables. Compared with previous studies, we introduce two novelties: we use the business group as the unit of analysis to detect the ownership of vertically related productions, and we assess the moderating role of geographic agglomeration in reducing the need for vertical integration. Our results confirm the importance of technology and price uncertainty in influencing vertical integration; moreover, the misalignment variable is significant in the case of profitability, but not in the case of growth. Copyright © 2014 John Wiley & Sons, Ltd.

7.
Technological innovation depends on knowledge developed by scientific research. The number of citations made in patents to the scientific literature has been suggested as an indicator of this process of transfer of knowledge from science to technology. We provide an intersectoral insight into this indicator, by breaking down patent citations into a sector-to-sector matrix of knowledge flows. We then propose a method to analyze this matrix and construct various indicators of science intensity of sectors, and the pervasiveness of knowledge flows. Our results indicate that the traditional measure of the number of citations to science literature per patent captures important aspects of intersectoral knowledge flows, but that other aspects are not captured. In particular, we show that high science intensity implies that sectors are net suppliers of knowledge in the economic sector, but that science intensity does not say much about pervasiveness of either knowledge use or knowledge supply by sectors. We argue that these results are related to the specific and specialized nature of knowledge.

8.
We offer a policy-basis for interpreting, justifying, and designing (3, 3)-political rules, a large class of collective rules analogous to those governing the selection of papers in peer-reviewed journals, where each referee chooses to accept, reject, or invite a resubmission of a paper, and an editor aggregates his own and referees’ opinions into one of these three recommendations. We prove that any such rule is a weighted multicameral rule: a policy is collectively approved at a given level if and only if it is approved by a minimal number of chambers — the dimension of the rule — where each chamber evaluates a different aspect of the policy using a weighted rule, with each evaluator’s weight or authority possibly varying across chambers depending on his area(s) of expertise. These results imply that a given rule is only suitable for evaluating finite-dimensional policies whose dimension corresponds to that of the rule, and they provide a rationale for using different rules to pass different policies even within the same organization. We further introduce the concept of compatibility with a rule and exploit its topological properties to propose a method to construct integer weights corresponding to evaluators’ possible judgments under a given rule, which are more intuitive and easier to interpret for policymakers. Our findings shed light on multicameralism in political institutions and multi-criteria group decision-making in the firm. We provide applications to peer review politics, rating systems, and real-world organizations.

9.
Stochastic Frontiers and Asymmetric Information Models
This article is an attempt to shed light on the specification and identification of inefficiency in stochastic frontiers. We consider the case of a regulated firm or industry. Applying a simple principal-agent framework that accounts for informational asymmetries to this context, we derive the associated production and cost frontiers. Notably, this approach yields a decomposition of inefficiency into two components: the first component is a pure random term, while the second component depends on the unobservable actions taken by the agent (the firm). This result provides a theoretical foundation for the usual specification applied in the literature on stochastic frontiers. An application to a panel data set of French urban transport networks serves as an illustration.
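The two-component decomposition of inefficiency described above can be illustrated with a tiny simulation (purely synthetic numbers, not the French transport data): the composed error adds a symmetric noise term v and subtracts a one-sided inefficiency term u, so firms sit on or below the frontier on average:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000
x = rng.uniform(1.0, 10.0, n)                # input level
v = rng.normal(scale=0.2, size=n)            # pure random component
u = np.abs(rng.normal(scale=0.3, size=n))    # one-sided inefficiency (half-normal)
log_y = 1.0 + 0.5 * np.log(x) + v - u        # stochastic production frontier

# The composed error v - u is skewed left: output can fall below the
# frontier due to inefficiency, but never systematically exceed it.
mean_composed_error = float(np.mean(v - u))
```

In the paper's principal-agent setting the u term is tied to the agent's unobservable actions rather than drawn from an assumed half-normal; the distribution here is only a conventional placeholder.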

10.
Construction of block designs admitting an action of a presumed automorphism group consists of two basic steps: (1) construction of orbit structures for the given automorphism group, and (2) construction of block designs for the orbit structures obtained in step (1). For an abelian group G, the second step, called indexing, often takes too long, because there are too many possibilities to enumerate with today's computing facilities. To make such a construction possible, we use a principal series of the group G to obtain a refinement of the orbit structures.

11.
We study a dynamic model of opinion formation in social networks. In our model, boundedly rational agents update opinions by averaging over their neighbors’ expressed opinions, but may misrepresent their own opinion by conforming or counter-conforming with their neighbors. We show that an agent's social influence on the long-run group opinion is increasing in network centrality and decreasing in conformity. Concerning the efficiency of information aggregation, or the “wisdom” of the society, it turns out that misrepresentation of opinions need not undermine wisdom, but may even enhance it. Given the network, we provide the optimal distribution of conformity levels in the society and show which agents should be more conforming in order to increase wisdom.
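A minimal sketch of the naive-averaging dynamics described above (a toy three-agent network with assumed weights, ignoring the paper's conformity mechanism): repeated averaging converges to a consensus whose weights are given by network centrality, i.e., the left Perron eigenvector of the trust matrix:

```python
import numpy as np

# Row-stochastic trust matrix: W[i, j] is the weight agent i places on
# neighbor j's expressed opinion (a toy network, not the paper's data).
W = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.1, 0.4, 0.5]])
x0 = np.array([1.0, 0.0, 0.5])       # initial opinions

# Repeated averaging x_{t+1} = W x_t drives the group to a consensus.
x = x0.copy()
for _ in range(200):
    x = W @ x

# Long-run influence weights: the left Perron eigenvector of W
# (a network-centrality measure), normalized to sum to one.
vals, vecs = np.linalg.eig(W.T)
v = np.real(vecs[:, np.argmax(np.real(vals))])
influence = v / v.sum()
consensus = float(influence @ x0)
```

In the paper, conformity distorts the expressed opinions being averaged, which rescales each agent's effective influence; this sketch shows only the centrality part of the result.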

12.
The expression “the bell curve” denotes both a kind of statistical distribution and the title of a famous and controversial book by Herrnstein and Murray. The first is so attractive that the second invokes it to lend credibility to its questionable theories on intelligence. The point is that, during the 20th century, the bell curve assumed an increasingly important role in psychological research and practice and became both a reality and a myth. In the first case (reality) we see appropriate applications of a genuinely useful statistical concept. In the second (myth) we find two kinds of attitudes: one is typical of researchers who search for normality in all their data and variables, just as Parsifal searched for the Holy Grail (we call this “the Parsifal attitude”); the other is typical of researchers who take normality for granted and act as if it were a Platonic Idea (we call this “the Plato attitude”). The article discusses the role of the normal distribution in psychological research and practice and shows how dangerous it can be to treat the bell curve as a God or an Idol.

13.
This paper examines the current state of application of qualitative methods, namely case studies, in purchasing and supply management. We argue that the case study method has much to contribute to the development of the discipline, in particular in terms of theory development, providing strong exemplars, and testing theories culled from other disciplines. In examining the use of the case method in purchasing and supply management, we suggest that there is a noticeable trend away from single case designs with sparse methodological reflections toward multiple case, comparative designs accompanied by conventional method justifications. These developments are broadly welcome, but we identify two blind spots: (1) the relative neglect of the links between theory and method and (2) the use of inappropriate statistical criteria to justify multiple case research designs. We discuss the nature of these problems using a number of examples and formulate rules for conducting good case research.

14.
Forecast combination through dimension reduction techniques
This paper considers several methods of producing a single forecast from several individual ones. We compare “standard” but hard to beat combination schemes (such as the average of forecasts at each period, the consensus forecast, and OLS-based combination schemes) with more sophisticated alternatives that involve dimension reduction techniques. Specifically, we consider principal components, dynamic factor models, partial least squares and sliced inverse regression. Our source of forecasts is the Survey of Professional Forecasters, which provides forecasts for the main US macroeconomic aggregates. The forecasting results show that partial least squares, principal component regression and factor analysis perform similarly (better than the usual benchmark models), but sliced inverse regression shows an extreme behavior (it performs either very well or very poorly).
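As a hedged illustration of one of the combination schemes above, principal-component combination, on synthetic data standing in for the Survey of Professional Forecasters panel (the actual data and model tuning are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(0)
T, n = 120, 15                        # periods, individual forecasters
target = rng.normal(size=T)           # series being forecast (synthetic)
# Individual forecasts = target + idiosyncratic noise.
forecasts = target[:, None] + rng.normal(scale=0.8, size=(T, n))

# Simple average: the hard-to-beat benchmark combination.
avg = forecasts.mean(axis=1)

# Principal-component combination: project the forecast panel on its first
# principal component, then rescale by OLS against the target.
F = forecasts - forecasts.mean(axis=0)
_, _, Vt = np.linalg.svd(F, full_matrices=False)
pc1 = F @ Vt[0]                       # first principal component score
beta = np.polyfit(pc1, target, 1)     # OLS of target on the component
combined = np.polyval(beta, pc1)

mse_avg = float(np.mean((avg - target) ** 2))
mse_pc = float(np.mean((combined - target) ** 2))
```

Both combinations extract the common signal in the panel; with real survey data the ranking between them varies, which is the kind of comparison the paper runs.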

15.
The ability to design experiments in an appropriate and efficient way is an important skill, but students typically have little opportunity to get that experience. Most textbooks introduce standard general‐purpose designs, and then proceed with the analysis of data already collected. In this paper we explore a tool for gaining design experience: computer‐based virtual experiments. These are software environments which mimic a real situation of interest and invite the user to collect data to answer a research question. Two prototype environments are described. The first is suitable for a course that deals with screening or response surface designs; the second allows experimenting with block and row‐column designs. They are part of a collection we developed called ENV2EXP, which can be freely used over the web. We also describe our experience in using them in several courses over the last few years.

16.
We consider the following problem. There is a structural equation of interest that contains an explanatory variable that theory predicts is endogenous. There are one or more instrumental variables that credibly are exogenous with regard to this structural equation, but which have limited explanatory power for the endogenous variable. Further, there are one or more potentially ‘strong’ instruments, which have much more explanatory power but which may not be exogenous. Hausman (1978) provided a test for the exogeneity of the second instrument when none of the instruments are weak. Here, we focus on how the standard Hausman test performs in the presence of weak instruments, using the Staiger–Stock asymptotics. It is natural to conjecture that the standard version of the Hausman test would be invalid in the weak instrument case, which we confirm. However, we provide a version of the Hausman test that is valid even in the presence of weak instruments, and we illustrate how to implement the test in the presence of heteroskedasticity. We show that the situation we analyze occurs in several important economic examples. Our Monte Carlo experiments show that our procedure works relatively well in finite samples. We should note that our test is not consistent, although we believe that it is impossible to construct a consistent test with weak instruments.
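A minimal simulation of the OLS-versus-IV contrast underlying the Hausman test (an illustrative data-generating process, not the paper's Monte Carlo design): with an endogenous regressor, the OLS slope is biased while the 2SLS slope based on the exogenous instrument is not, and the gap between the two estimates is what the test examines:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
z = rng.normal(size=n)                 # credibly exogenous instrument
u = rng.normal(size=n)                 # structural error
x = 0.8 * z + u + rng.normal(size=n)   # endogenous regressor (correlated with u)
y = 2.0 * x + u                        # structural equation, true slope = 2

def ols(X, y):
    """Least-squares coefficients of y on the columns of X."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

one = np.ones(n)
b_ols = ols(np.column_stack([one, x]), y)          # inconsistent under endogeneity
x_hat = np.column_stack([one, z]) @ ols(np.column_stack([one, z]), x)  # first stage
b_iv = ols(np.column_stack([one, x_hat]), y)       # 2SLS slope, consistent

# Hausman-type contrast: a large OLS-vs-IV slope gap signals endogeneity
# (the full test scales the gap by the difference of estimated variances).
gap = float(b_ols[1] - b_iv[1])
```

With a weak instrument (coefficient on z near zero), the IV estimate and hence this contrast become erratic, which is the failure mode the paper's corrected test addresses.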

17.
In this paper, we use several indicators of trade informativeness to search for informed traders on the final trading days of Banco Popular, the first and only bank resolution case to date in the euro area. In particular, we use the model proposed by Preve and Tse (2013) to estimate the adjusted daily probability of informed trading and the probability of symmetric order-flow shock using high-frequency transaction data. Our empirical results indicate that, in anticipation of a possible liquidation of the bank, informed investors reacted to the bad news by placing more weight on it, and that Banco Popular experienced large increases in both buy- and sell-orders during the last days of trading, when the bank registered a significant depletion of its deposit base. Moreover, we find evidence supporting the presence of insider trading and illiquidity, especially after speculation in the media that the bank could face a liquidation. Our study has important implications for market participants and regulatory authorities.

18.
We find that, owing to network-scale effects, the “group user” and “family number” services promoted by operators give the large in-group network a pronounced pull on customers of the small network, so an operator's initial market share within a group matters considerably; this makes competition among mobile operators fiercer. We also point out that although the main purpose of “group user” offerings is for mobile operators to win over rivals' customers, they incidentally reduce customers' use of fixed-line telephony, making the fixed-line network a casualty of mobile-network competition and the substitution of mobile for fixed-line service more pronounced. Both effects aggravate the structural imbalance of China's telecommunications market.

19.
王积社. 价值工程 (Value Engineering), 2011, 30(24): 270-271
This paper studies the group graphs of Sn and An in detail, corrects some erroneous conclusions in the earlier paper “Basic theory of group graphs and the construction of permutation group graphs” (《群图的基本理论及置换群图的构造》), and uses a computer to produce partial group graphs of S4 and A4.

20.
Entropy-based evaluation and selection of project organizational structures
周栩, 汤立, 颜红艳. 价值工程 (Value Engineering), 2006, 25(10): 109-111
The paper first analyzes the influence of system structure on information flows within a system, evaluates the ordering degree of a project organizational structure from an information perspective, and introduces the concepts of timeliness and quality of information flow. It then builds, on the basis of entropy theory, a timeliness-quality entropy model that supports both optimal design and quantitative evaluation of organizational structures, and gives detailed steps for computing the timeliness-quality ordering degree. Finally, the method is applied to evaluate, via ordering degree, the linear and matrix organizational forms of an engineering company.
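The paper's timeliness-quality entropy model is not reproduced here; the following sketch only illustrates the generic entropy-based ordering degree R = 1 − H/H_max that such models build on (function name and example numbers are illustrative assumptions):

```python
import math

def ordering_degree(probs):
    """Generic entropy-based ordering degree R = 1 - H / H_max, where H is
    the Shannon entropy of the information-flow distribution over channels.
    (The paper's timeliness-quality entropy is more specific; this shows
    only the common entropy-ordering idea.)"""
    k = len(probs)
    h = -sum(p * math.log(p) for p in probs if p > 0)
    h_max = math.log(k)
    return 1.0 - h / h_max

# Flows concentrated on few channels -> higher ordering degree.
r_uniform = ordering_degree([0.25, 0.25, 0.25, 0.25])   # ~0, maximally disordered
r_skewed = ordering_degree([0.7, 0.1, 0.1, 0.1])
```

Comparing R across candidate structures (e.g., linear vs. matrix organization) is the kind of quantitative ranking the abstract describes.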


Copyright©北京勤云科技发展有限公司  京ICP备09084417号