Similar Documents
20 similar documents found.
1.
This paper investigates the implementation of integrated reporting (IR) by Generali, one of the most important listed companies in Italy. The research questions we aim to answer are the following: Is the IR approach to materiality inherently different from the sustainability reporting (SR) approach? Does IR lead to the identification of different material topics than SR does? On the one hand, institutional theory suggests that IR and SR material topics will differ significantly, because IR is mainly driven by a market logic whereas SR is inspired by a stakeholder logic. On the other hand, organizational change theory predicts some organizational resistance to change, which would lead IR and SR topics to remain similar. To answer our research questions, we carry out two empirical analyses. First, we propose and develop an innovative methodological approach based on content analysis, which allows the materiality of different issues to be measured under the IR approach. Second, we rely on evidence obtained through interviews, which suggests that the IR and SR approaches to materiality are inherently different.

2.
We explore a new approach to the forecasting of macroeconomic variables based on a dynamic factor state space analysis. Key economic variables are modeled jointly with principal components from a large time series panel of macroeconomic indicators using a multivariate unobserved components time series model. When the key economic variables are observed at a low frequency and the panel of macroeconomic variables is at a high frequency, we can use our approach for both nowcasting and forecasting purposes. Given a dynamic factor model as the data generation process, we provide Monte Carlo evidence of the finite-sample justification of our parsimonious and feasible approach. We also provide empirical evidence for a US macroeconomic dataset. The unbalanced panel contains quarterly and monthly variables. The forecasting accuracy is measured against a set of benchmark models. We conclude that our dynamic factor state space analysis can lead to higher levels of forecasting precision when the panel size and time series dimensions are moderate.
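The general nowcasting idea described in the abstract above (principal components extracted from a high-frequency panel feeding a model for a low-frequency key variable) can be sketched as follows. This is a minimal stand-in, not the authors' multivariate unobserved components model; the data, panel size, and the use of SARIMAX with exogenous factors are illustrative assumptions.

```python
# Minimal sketch: principal components from a monthly macro panel used as
# regressors in a state-space model for a quarterly target (e.g. GDP growth).
# Hypothetical data; not the paper's exact multivariate unobserved-components model.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
monthly = pd.DataFrame(rng.standard_normal((240, 30)),
                       index=pd.period_range("2000-01", periods=240, freq="M"))

# Step 1: extract a few principal components from the standardized monthly panel.
z = (monthly - monthly.mean()) / monthly.std()
factors = pd.DataFrame(PCA(n_components=3).fit_transform(z), index=monthly.index)

# Step 2: aggregate the factors to quarterly frequency and model the key variable
# together with them (SARIMAX with exogenous factors as a simple stand-in).
factors_q = factors.groupby(factors.index.asfreq("Q")).mean()
gdp_growth = pd.Series(rng.standard_normal(len(factors_q)), index=factors_q.index)

res = SARIMAX(gdp_growth, exog=factors_q, order=(1, 0, 0)).fit(disp=False)
print(res.summary().tables[1])
```

In a real application the quarterly target would be an observed series and the factor extraction and state space steps would be estimated jointly, as in the paper.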

3.
This paper suggests the possibility of an interdisciplinary, tripartite merger of transaction cost economics and the concept of embeddedness with feminist insights. It demonstrates that, in isolation, a simple application of transaction cost analysis can offer an adequate explanation of economic activity. The explanatory power of this approach, however, is enhanced when it is complemented by greater recognition of the importance of the social context in which economic activity occurs. The paper uses research from New Zealand's largest street market to examine women's work in street commerce, a sub-sector of the informal sector. Aspects of transaction cost analysis are applied to the activities of women market vendors. We propose, however, that our approach, which considers the embeddedness of economic activity in ongoing networks of social relations and the intertwining of economic with non-economic goals, is compatible with aspects of feminism. Novel features of the analysis include the application of transaction cost analysis to informal sector activity and the synthesis of this approach with a feminist-oriented network analysis.

4.
In this paper we introduce a novel approach to the theory of tournament rankings. We combine two different theories that are widely used to establish rankings of populations after a given tournament. First, we use the statistical approach of paired comparison analysis to define the performance of a player in a natural way. We then determine a ranking (and rating) of the players in the given tournament. Finally, we show, among other properties, that the new ranking method is the unique one satisfying a natural consistency requirement.
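The abstract does not give the exact statistical model, but a standard paired-comparison rating such as Bradley-Terry illustrates how pairwise tournament results can be turned into ratings and hence a ranking. The win matrix below is made up, and the paper's own rating and consistency axiom may differ.

```python
# Bradley-Terry ratings from a tournament's pairwise win counts, fitted with the
# standard minorization-maximization updates. Illustrative data only; the
# paper's own rating method may differ.
import numpy as np

# wins[i, j] = number of times player i beat player j (hypothetical tournament).
wins = np.array([[0, 3, 2, 4],
                 [1, 0, 3, 2],
                 [2, 1, 0, 3],
                 [0, 2, 1, 0]], dtype=float)

n = wins.shape[0]
games = wins + wins.T          # total games between each pair
p = np.ones(n)                 # initial ratings

for _ in range(1000):
    total_wins = wins.sum(axis=1)
    denom = np.array([
        sum(games[i, j] / (p[i] + p[j]) for j in range(n) if j != i)
        for i in range(n)
    ])
    p_new = total_wins / denom
    p_new /= p_new.sum()       # normalize so the ratings sum to one
    if np.max(np.abs(p_new - p)) < 1e-10:
        p = p_new
        break
    p = p_new

ranking = np.argsort(-p)       # best-rated player first
print("ratings:", np.round(p, 3), "ranking:", ranking)
```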

5.
Covariate information is often available for each subject in a randomised clinical trial prior to treatment assignment. It is commonly used to adjust for baseline characteristics that are predictive of the outcome, in order to increase precision and improve power in detecting a treatment effect. Motivated by a nonparametric covariance analysis, we study a projection approach to making objective covariate adjustment in randomised clinical trials on the basis of two unbiased estimating functions that decouple the outcome and covariate data. The proposed projection approach extends a weighted least-squares procedure by projecting one of the estimating functions onto the linear subspace spanned by the other estimating function, which is E-ancillary for the average treatment effect. Compared with the weighted least-squares method, the projection method allows objective inference on the average treatment effect by exploiting the treatment-specific covariate–outcome associations. The resulting projection-based estimator of the average treatment effect is asymptotically efficient when the treatment-specific working regression models are correctly specified, and asymptotically more efficient than existing competitors when those working models are misspecified. The proposed projection method is illustrated with an analysis of data from an HIV clinical trial. In a simulation study, we show that the projection method compares favourably with its competitors in finite samples.
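For orientation, covariate adjustment with treatment-specific working regression models can be sketched with a standard regression-adjustment (g-computation) estimator: fit a linear model in each arm, predict for all subjects, and average the difference. This is only a familiar baseline related to the setting above, not the paper's projection construction; the simulated data are hypothetical.

```python
# Covariate-adjusted average treatment effect via treatment-specific linear
# working models: fit a regression in each arm, predict for all subjects, and
# average the difference. A standard adjustment sketch, not the paper's exact
# projection estimator.
import numpy as np

rng = np.random.default_rng(1)
n = 500
x = rng.standard_normal((n, 2))              # baseline covariates
t = rng.integers(0, 2, n)                    # randomized treatment assignment
y = 1.0 * t + x @ np.array([0.8, -0.5]) + rng.standard_normal(n)

def fit_linear(xmat, yvec):
    """Ordinary least squares with an intercept."""
    design = np.column_stack([np.ones(len(yvec)), xmat])
    beta, *_ = np.linalg.lstsq(design, yvec, rcond=None)
    return beta

beta1 = fit_linear(x[t == 1], y[t == 1])     # working model in the treated arm
beta0 = fit_linear(x[t == 0], y[t == 0])     # working model in the control arm

design_all = np.column_stack([np.ones(n), x])
ate_adjusted = (design_all @ beta1 - design_all @ beta0).mean()
ate_unadjusted = y[t == 1].mean() - y[t == 0].mean()
print(f"unadjusted: {ate_unadjusted:.3f}, covariate-adjusted: {ate_adjusted:.3f}")
```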

6.
We provide an overview of the grounded theory approach, a methodology with significant (and largely untapped) potential for human resources (HR) research. Grounded theory is an abductive, data-driven, theory-building approach that can serve as a conceptual link between inductive and deductive research approaches. We begin by explaining the grounded theory approach in detail and outlining two versions of the method that have been used in high-impact management publications—the Gioia approach and the Tabula Geminus (twin slate) approach. We then provide an overview of the similarities and differences between grounded theory and other inductive and/or qualitative methodologies, namely, ethnography, discourse analysis, rhetorical analysis, and content analysis. Following this discussion, we offer a step-by-step guide to using grounded theory in human resources research, illustrating these principles with data and processes from extant research. Finally, we conclude by discussing best practices for achieving rigor with the grounded theory approach.

7.
A New Approach to Panel Data Model Specification: Unifying Fixed and Random Effects
In classical panel data modelling there has been a long-standing debate over whether to specify fixed or random effects, and misspecification of this kind often leads to inefficient parameter estimates and to confusion between one-way and two-way error components models. This paper proposes a new general error components model that encompasses both random and fixed effects, in which the one-way model is simply a special case of the two-way model, and a pure random-effects or fixed-effects model is likewise a special case. On the basis of this general error components model, we seek to place the discussion of panel data models within a more general and unified analytical framework.
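The abstract does not spell out the estimator, but one familiar way to nest the fixed- versus random-effects choice in a single error components specification is the Mundlak (correlated random effects) device: add the group means of the regressors to a random-intercept model. The sketch below, on hypothetical data, illustrates that general idea only; it is not the authors' model.

```python
# Correlated-random-effects (Mundlak) sketch: a random-intercept model that adds
# the group means of the regressors, nesting the usual fixed- vs random-effects
# choice in one specification. Hypothetical data; not the authors' exact model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_units, n_periods = 50, 8
unit = np.repeat(np.arange(n_units), n_periods)
alpha = rng.standard_normal(n_units)                              # unit effects
x = rng.standard_normal(n_units * n_periods) + 0.5 * alpha[unit]  # x correlated with effects
y = 2.0 * x + alpha[unit] + rng.standard_normal(n_units * n_periods)

df = pd.DataFrame({"y": y, "x": x, "unit": unit})
df["x_bar"] = df.groupby("unit")["x"].transform("mean")           # Mundlak term

# Random intercept by unit; x_bar absorbs the correlation between x and the effects.
result = smf.mixedlm("y ~ x + x_bar", data=df, groups=df["unit"]).fit()
print(result.summary())
```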

8.
The statistical analysis of empirical questionnaire data can be hampered by the fact that not all questions are answered by all individuals. In this paper we propose a simple, practical method to deal with such item nonresponse in the case of ordinal questionnaire data, where we assume that item nonresponse is caused by an incomplete set of answer options among which the individuals are supposed to choose. Our statistical method is based on extending the ordinal regression model with an additional category for nonresponse, and on investigating whether this extended model describes and forecasts the data well. We illustrate our approach with two questions from a questionnaire held amongst a sample of clients of a financial investment company.
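One simple way to operationalize the idea of modelling ordinal answers alongside nonresponse is a two-part model: a logit for whether the item is answered at all plus an ordered logit for the answers given. The paper instead extends a single ordinal regression with an extra nonresponse category, so the sketch below (on simulated data) is only an illustrative stand-in.

```python
# Two-part stand-in for ordinal data with item nonresponse: a logit for
# "answered vs. not answered" plus an ordered logit for the answers given.
# Not the paper's single extended ordinal model; simulated data.
import numpy as np
import statsmodels.api as sm
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(3)
n = 800
x = rng.standard_normal(n)                                 # respondent characteristic
answered = rng.random(n) < 1 / (1 + np.exp(-(0.5 + 0.8 * x)))
latent = 1.2 * x + rng.logistic(size=n)
answer = np.digitize(latent, [-1.0, 0.0, 1.0])             # ordinal answer in {0,1,2,3}

# Part 1: who answers the item at all.
resp_model = sm.Logit(answered.astype(int), sm.add_constant(x)).fit(disp=False)

# Part 2: ordered logit among those who answered.
ord_model = OrderedModel(answer[answered], x[answered].reshape(-1, 1), distr="logit")
ord_result = ord_model.fit(method="bfgs", disp=False)
print(resp_model.params, ord_result.params, sep="\n")
```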

9.
There is no shortage of literature studying the relationships between the various processes of human resources management, considered individually, and the strategy of the company. Nevertheless, studies that adopt a joint approach are scarce. In this study, working from the universalist, contingent and configurational perspectives, we seek to identify the possible existence of human resources management models and their links with company strategy. An empirical analysis of 130 industrial companies reveals three distinct models of human resources management, but their behaviour is independent of the strategies followed by the companies. At the same time, we find within each model orientations of particular processes that are common across the models and are thus characteristic of a universalist approach.

10.
We develop a method for eco-efficiency analysis of consumer durables that is based on Data Envelopment Analysis (DEA). In contrast to previous product efficiency studies, we consider the measurement problem from the policy perspective. The innovation of the paper is to measure efficiency in terms of absolute shadow prices that are optimized endogenously within the model to maximize the efficiency of the good. Thus, the efficiency measure has a direct economic interpretation as a monetary loss due to inefficiency, expressed in some currency unit. The advantages of the proposed approach, as well as its technical differences from the traditional production-side methods, are discussed in detail. We illustrate the approach with an application to eco-efficiency evaluation of Sport Utility Vehicles.
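The paper's model prices environmental pressures with endogenously optimized absolute shadow prices; the sketch below shows only the standard input-oriented CCR DEA program that such eco-efficiency models build on, solved with scipy on made-up product data.

```python
# Standard input-oriented CCR DEA efficiency score for one unit, solved as a
# linear program. Only the textbook formulation, not the paper's shadow-price
# eco-efficiency model; made-up data (columns = products, rows = inputs/outputs).
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 4.0, 5.0],       # "inputs", e.g. environmental pressures
              [1.0, 2.0, 1.5, 3.0]])
Y = np.array([[10.0, 12.0, 11.0, 14.0]])  # "outputs", e.g. service value

def ccr_efficiency(o):
    """Efficiency of unit o: min theta s.t. X@lam <= theta*x_o, Y@lam >= y_o, lam >= 0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                    # variables: [theta, lam_1..lam_n]
    A_inputs = np.hstack([-X[:, [o]], X])          # X@lam - theta*x_o <= 0
    A_outputs = np.hstack([np.zeros((s, 1)), -Y])  # -Y@lam <= -y_o
    A_ub = np.vstack([A_inputs, A_outputs])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

print([round(ccr_efficiency(o), 3) for o in range(X.shape[1])])
```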

11.
Hedonic prices, demands for urban housing amenities, and benefit estimates
This paper uses a Rosen two-step, hedonic price-trait demand approach to estimate demand functions for a vector of urban amenities. To ascertain whether this theoretically preferred approach yields benefit estimates that differ from those of the oft-used Ridker-Henning one-step hedonic approach, we conduct a sensitivity analysis. We find that the two-step approach does yield different benefit estimates and that the differences are large for some amenities. The estimates are sensitive to the functional form of the hedonic equation when the forms are significantly different according to modified Box-Cox results, but are not particularly sensitive to the specification of the amenity demand equation.
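A minimal sketch of the two-step logic on simulated housing data: step 1 estimates a hedonic price function and recovers implicit amenity prices; step 2 regresses those implicit prices on amenity levels and a demand shifter. The log-linear form, variable names, and data are illustrative assumptions, and the well-known identification issues and the paper's functional-form sensitivity analysis are ignored.

```python
# Two-step (Rosen-style) hedonic sketch on simulated data; made-up variables.
import numpy as np

rng = np.random.default_rng(4)
n = 1000
air_quality = rng.uniform(0, 1, n)           # amenity of interest
size = rng.uniform(50, 200, n)               # other housing trait
income = rng.uniform(20, 80, n)              # demand shifter
log_price = 10 + 0.4 * air_quality + 0.005 * size + 0.1 * rng.standard_normal(n)

def ols(X, y):
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return beta

# Step 1: hedonic price equation (log-linear form for simplicity).
b = ols(np.column_stack([air_quality, size]), log_price)
price = np.exp(log_price)
implicit_price = b[1] * price                # d(price)/d(amenity) = beta * price

# Step 2: amenity "demand": implicit price regressed on amenity level and income.
d = ols(np.column_stack([air_quality, income]), implicit_price)
print("hedonic coefficients:", np.round(b, 3))
print("second-stage coefficients:", np.round(d, 3))
```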

12.
The Treasury's forecast, published with the Autumn Statement, has been widely heralded as showing a surprisingly cheerful picture for next year as far as both output and inflation are concerned. In fact it is close to the forecast which we produced in October. Here we compare the two forecasts and then consider how our forecast is affected when we adopt the Treasury assumptions on asset sales and the exchange rate. We find that the Treasury is more optimistic than we are on investment, and that holding the exchange rate, which is needed to produce the official inflation forecast, requires rather higher interest rates than we assumed in October; this widens the gap between our forecast for GDP and the Treasury's forecast.
We also consider how the government should respond to lower North Sea oil revenues. Taking a permanent income approach, we suggest that the PSBR should be allowed to rise by £2bn on this basis. The same approach, however, suggests that an extra £7½bn of asset sales should be used to cut the PSBR, not taxes. On balance, therefore, this analysis indicates that next year's PSBR target should be lowered by £½bn from the £7½bn contained in the 1985 MTFS.

13.
The concept of social cohesion dates back to the end of the nineteenth century, when society was experiencing epochal transformations, as it is today. Whenever there are epochal changes, a question of social order (cohesion) arises. The paper provides a conceptual scheme of social cohesion, identifying its constituent dimensions subdivided across three spheres (macro, meso, micro) and two perspectives (objective and subjective). The overarching aim is to test the validity of the operationalization of the social cohesion model provided. First, we conducted an exploratory factor analysis using exploratory structural equation modeling, an approach implemented in Mplus that has several useful characteristics. Afterwards, through a structural equation modeling approach, we performed several confirmatory factor analyses, adopting a multiple-group SEM strategy in order to cross-validate the social cohesion model.
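As a rough computational stand-in for the first (exploratory) step only, a bare-bones exploratory factor analysis can be run as below; the actual analysis uses ESEM in Mplus followed by multiple-group CFA, which this sketch does not attempt, and the item structure is simulated.

```python
# Bare-bones exploratory factor analysis on simulated items; a stand-in for the
# ESEM step described in the abstract, not the Mplus analysis itself.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(5)
n = 600
latent = rng.standard_normal((n, 2))                   # e.g. objective / subjective dimensions
loadings = np.array([[0.9, 0.0], [0.8, 0.1], [0.7, 0.0],
                     [0.0, 0.9], [0.1, 0.8], [0.0, 0.7]])
items = latent @ loadings.T + 0.5 * rng.standard_normal((n, 6))

fa = FactorAnalysis(n_components=2, rotation="varimax").fit(items)
print(np.round(fa.components_.T, 2))                   # estimated loadings by item
```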

14.
Heterogeneous agent models (HAMs) in finance and economics are often characterised by high dimensional nonlinear stochastic differential or difference systems. Because of the complexity of the interaction between the nonlinearities and noise, a commonly used (often called indirect) approach to the study of HAMs combines theoretical analysis of the underlying deterministic skeleton with numerical analysis of the stochastic model. However, it is well known that this indirect approach may not properly characterise the nature of the stochastic model. This paper aims to tackle this issue by developing a direct and analytical approach to the analysis of a stochastic model of speculative price dynamics involving two types of agents, fundamentalists and chartists, in which the market price equilibria can be characterised by the stationary measures of a stochastic dynamical system. Using the stochastic method of averaging and stochastic bifurcation theory, we show that the stochastic model displays behaviour consistent with that of the underlying deterministic model when the time lag in the formation of price trends used by the chartists is far away from zero. However, when this lag approaches zero, such consistency breaks down.

15.
A number of recent articles have attempted to restore the use of a simple measure of the money supply as an indicator of future price levels and to re-establish a causal link from money to prices. Most notably, Hallman, Porter and Small (HPS) (1989a, 1989b) originated the approach using US data, and Hannah and James (1989) have applied it to the UK. The approach broadens the traditional idea of a constant velocity of money by introducing the notions of V* and Q*, the long-run values of velocity and income. These are then used to define P*, from the traditional quantity theory of money, as the long-run equilibrium price level. The analysis then proceeds to estimate a standard Error Correction Model (ECM) for price determination with the levels effect given by (P − P*)_{t-1}. The conclusion drawn is that 'a measure of money that determines the long-run future level of prices is useful in determining the proper monetary policy for attaining price stability. We have shown, through the construction of P*, that M2 can serve as this determinant for the price level' (Hallman, Porter and Small (1989a), p. 23).
We argue in this paper that the P* approach is flawed. It is certainly more complex than traditional monetarist approaches, but the fundamental questions of causality are in no way affected or resolved. The P* analysis is a variant on more conventional cointegration analysis (Engle and Granger (1987), Johansen (1988), Hall (1989)), and we argue that the Johansen framework allows us to address the question in a formal and more complete way. When this approach is applied to the US data used by HPS, we find that while the P* relationship does indeed represent a cointegrating one, it does not imply a causal link running from money to prices; rather, the causality runs from prices to money. This result conforms well to the work of Hendry and Ericsson (1990) and Hall, Henry and Wilcox (1990), which use this form of relationship to model the demand for money.
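For reference, the P* construction described above can be written compactly from the quantity theory; the error-correction equation below is a generic illustrative form, not the exact equation estimated by HPS or by this paper.

```latex
% Quantity-theory definition of the long-run equilibrium price level P*,
% and a generic error-correction equation using the price gap (P - P*)_{t-1}.
\begin{align}
  P^{*}_{t} &= \frac{M2_{t}\, V^{*}}{Q^{*}_{t}}, \\
  \Delta p_{t} &= \alpha + \sum_{i} \beta_{i}\, \Delta p_{t-i}
                 + \gamma\,\bigl(p_{t-1} - p^{*}_{t-1}\bigr) + \varepsilon_{t},
\end{align}
% where lower-case letters denote logarithms, and V* and Q* are the long-run
% values of velocity and output.
```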

16.
This paper is concerned with modelling the relationship between active and passive labour market policies and the aggregate unemployment outflow rate. Our model is based on a matching function and includes a simple representation of the competition between various groups of job searchers. The empirical analysis uses Belgian data. Faced with variables that are often found to be integrated of order 1 according to the usual tests, but which cannot, strictly speaking, be integrated, we contribute to an important methodological debate by comparing the conclusions of a classical econometric analysis and a cointegration approach.
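The matching-function setup referred to above can be illustrated with the standard Cobb-Douglas form; the efficiency term A, the exponent, and the absence of the paper's competition terms are illustrative simplifications.

```latex
% A standard Cobb-Douglas matching function and the implied unemployment
% outflow rate; A and alpha are illustrative, and the paper's policy and
% competition terms are omitted.
\begin{align}
  M_{t} &= A\, U_{t}^{\alpha} V_{t}^{1-\alpha}, &
  f_{t} &= \frac{M_{t}}{U_{t}} = A \left(\frac{V_{t}}{U_{t}}\right)^{1-\alpha},
\end{align}
% where M_t denotes matches (hires), U_t unemployment, V_t vacancies, and
% f_t the aggregate outflow rate from unemployment.
```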

17.
This paper provides an approach to the analysis of similarities in time series seasonal patterns based on a special MDS approach, the non-metric SSA-I (Smallest Space Analysis) technique. Indices of dissimilarity for time series are defined in general terms, while special cases drawn from economic problems are treated by means of examples. The basic contributions of the paper are two-fold. First, we extend the use of SSA-I to time series analysis by transforming the mutual relationships between (as well as within) the time series into a symmetric matrix. As a result, the SSA-I tool developed by L. Guttman may easily be used. Second, by introducing non-metric techniques such as SSA-I into time series analysis, we increase our capacity to deal with problems hitherto unsolved. In particular, ordinal data, behavioural data for which model processes are not defined, and seasonal pattern similarities may be studied with our technique.
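A rough computational stand-in for the workflow above: build a symmetric dissimilarity matrix between the seasonal patterns of several series and submit it to non-metric MDS. The dissimilarity index (one minus the correlation of monthly means) and the simulated series are illustrative choices, not the paper's, and sklearn's non-metric MDS is used in place of Guttman's SSA-I program.

```python
# Non-metric MDS on a precomputed dissimilarity matrix between the seasonal
# patterns of several simulated time series; a stand-in for SSA-I.
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(6)
n_series, n_years = 6, 10
months = np.tile(np.arange(12), n_years)
series = np.array([np.sin(2 * np.pi * (months + 2 * k) / 12)
                   + 0.3 * rng.standard_normal(12 * n_years)
                   for k in range(n_series)])

# Seasonal pattern of each series = average by calendar month.
patterns = np.array([[s[months == m].mean() for m in range(12)] for s in series])
dissim = 1 - np.corrcoef(patterns)           # symmetric dissimilarity matrix

coords = MDS(n_components=2, metric=False, dissimilarity="precomputed",
             random_state=0).fit_transform(dissim)
print(np.round(coords, 2))                   # 2-D configuration of the series
```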

18.
By means of an integration of decision theory and probabilistic models, we explore and develop methods for improving data privacy. Our work encompasses disclosure control tools in statistical databases and privacy requirements prioritization; in particular, we propose a Bayesian approach for on-line auditing in statistical databases and pairwise comparison matrices for privacy requirements prioritization. The first approach is illustrated with examples from statistical analysis of census and medical data, where no salary (resp. medical information) that could be linked to a specific employee (resp. patient) may be released; the second is illustrated with examples such as an e-voting system and an e-banking service that must satisfy privacy requirements in addition to functional and security ones. Several fields in the social sciences, economics and engineering will benefit from advances in this research area: e-voting, e-government, e-commerce, e-banking, e-health, cloud computing and risk management are a few examples of applications for the findings of this research.
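To illustrate the second ingredient, priority weights for privacy requirements can be derived from a pairwise comparison matrix; the sketch below uses the common geometric-mean (row) method on a made-up matrix with hypothetical requirement names, and the paper may use a different derivation.

```python
# Priority weights for privacy requirements from a pairwise comparison matrix,
# using the geometric-mean (row) method. Matrix and requirement names are made up.
import numpy as np

requirements = ["anonymity of votes", "unlinkability of transactions", "data minimisation"]

# A[i, j] = how much more important requirement i is judged to be than requirement j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

geo_means = A.prod(axis=1) ** (1.0 / A.shape[1])
weights = geo_means / geo_means.sum()        # normalize to sum to one

for name, w in zip(requirements, weights):
    print(f"{name}: {w:.3f}")
```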

19.
Through building and testing theory, the practice of research animates data for human sense-making about the world. The IS field began in an era when research data was scarce; in today's age of big data, it is now abundant. Yet IS researchers often enact methodological assumptions developed in a time of data scarcity, and many remain uncertain how to systematically take advantage of new opportunities afforded by big data. How should we adapt our research norms, traditions, and practices to reflect newfound data abundance? How can we leverage the availability of big data to generate cumulative and generalizable knowledge claims that are robust to threats to validity? To date, IS academics have largely welcomed the arrival of big data as an overwhelmingly positive development. A common refrain in the discipline is: more data is great, IS researchers know all about data, and we are a well-positioned discipline to leverage big data in research and teaching. In our opinion, many benefits of big data will be realized only with a thoughtful understanding of the implications of big data availability and, increasingly, a deliberate shift in IS research practices. We advocate a need to revisit and extend traditional models that are commonly used to guide much of IS research. Based on our analysis, we propose a research approach that incorporates consideration of big data, and associated implications such as data abundance, into a classic approach to building and testing theory. We close our commentary by discussing the implications of this hybrid approach for the organization, execution, and evaluation of theory-informed research. Our recommendations on how to update one approach to IS research practice may have relevance to all theory-informed researchers who seek to leverage big data.

20.
This paper explores what sustainability managers do when attempting to scale sustainability to a strategic level within their organization. Drawing on semistructured interview data with 44 sustainability managers in large, for-profit companies, we identify three distinct scaling microstrategies that individuals use when scaling sustainability. We label these conforming, leveraging, and shaping. Our analysis also finds that sustainability managers deploy combinations of these microstrategies in three distinct approaches, which we call the assimilation approach, the mobilization approach, and the transition approach. Finally, we interrogate the degree to which employing these different approaches achieves a peripheral, intermediate, or strategic scale of sustainability within the organizations represented in the study. Our paper contributes to theory and practice at the interface of strategy and sustainability by developing a practice-based Scaling Approach Framework, whereby an assimilation approach is associated with organizations with sustainability at a peripheral scale, a mobilization approach is associated with an intermediate scale of sustainability, and a transition approach is associated with scaling sustainability to a strategic level. From these results, we propose a Scaling Progression Model that reflects the phases that individuals progress through when scaling sustainability.
