Similar Articles
 20 similar articles found
1.
The purpose of this paper is to forge a link between optimal levels of the three principal marketing mix instruments—price, promotion and product quality—and the development expenditures devoted to the quality improvement of a firm's existing product. Using the latest operational marketing mix model, which prescribes quantitatively what the optimal quality of a brand should be, a simple function relating this optimal quality to desirable development outlays is suggested and some of its implications are explored.

2.
We study how demarketing interacts with pricing decisions to explain why and when it can be employed as the seller's optimal strategy. In our model, a monopolistic seller offers different price‐quality bundles of the product. A consumer's preference is private information. With demarketing, consumers must make a costly effort to purchase and/or utilize the product, whereas with marketing, the seller instead makes the effort so that the consumer's purchasing decision is independent of the cost of effort. Our result suggests that, for small or large effort costs, it is optimal for the seller to engage in marketing. For intermediate effort costs, however, demarketing can be optimal. With demarketing, the seller induces only the consumers with high valuation to make transaction effort. By doing so, the seller can price discriminate more effectively, thus extracting more surplus. We extend our analysis to the case where the seller can offer special deals through exclusive sales channels along with demarketing. Then, demarketing can be optimal even for large costs of effort.

3.
We adopt a multistage search model, in which the home seller's reservation price is determined by her or his opportunity cost, search cost, discount rate and additional market parameters. The model indicates that a greater dispersion in offer prices leads to higher reservation and optimal asking prices. A unique dataset from the Tokyo condominium resale market enables us to test those modeled hypotheses. Empirical results indicate that a one percentage point increase in the standard deviation of submarket transaction prices results in a two‐tenths of a percent increase in the initial asking price and in the final transaction price. Increases in the dispersion of market prices enhance the probabilities of a successful transaction and/or an accelerated sale.
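The comparative static highlighted in the abstract (greater offer-price dispersion raising the reservation price) can be illustrated with a standard stationary search condition, in which the per-offer search cost equals the expected gain from one more draw. This is a textbook sketch rather than the authors' multistage model; the normal offer distribution, mean offer, and search cost below are illustrative assumptions.

```python
from scipy.stats import norm
from scipy.optimize import brentq

def expected_gain(R, mu, sigma):
    """E[(P - R)+] when offers are P ~ N(mu, sigma^2)."""
    z = (mu - R) / sigma
    return (mu - R) * norm.cdf(z) + sigma * norm.pdf(z)

def reservation_price(search_cost, mu, sigma):
    """Solve search_cost = E[(P - R)+] for the reservation price R."""
    return brentq(lambda R: expected_gain(R, mu, sigma) - search_cost,
                  mu - 10 * sigma, mu + 10 * sigma)

mu, c = 100.0, 0.5   # mean offer and per-search cost (illustrative units)
for sigma in (5.0, 10.0, 20.0):
    print(f"offer s.d. = {sigma:5.1f}  ->  reservation price = "
          f"{reservation_price(c, mu, sigma):7.2f}")
# A larger offer-price dispersion raises the reservation price, mirroring the
# paper's result that dispersion raises reservation and asking prices.
```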

4.
I consider a setting in which firms have unverifiable private information about their type, which corresponds to their probable product quality; firms can incur a learning cost in order to observe their quality; and the regulator can enforce false advertising penalties contingent only on verifiable realized quality. I show that it may be socially optimal for high type firms to signal their type through ‘speculative claims,’ rather than to learn and signal their quality. This implies that socially optimal false advertising penalties are finite, in contrast to the literature's common assumption of arbitrarily high false advertising penalties, and that the regulator optimally tolerates the existence of some false claims in equilibrium.

5.
This paper considers a hidden action agency problem where the principal has a single source of hidden information concerning the agent's utility, the agent's effort productivity, or the agent's cost of effort. We examine whether the principal should precommit to disclosing these different single sources of information to the agent. If the optimal contract is invariant to the hidden information and, thus, to the disclosure rule (the constant-elasticity case), disclosure increases the agent's utility; it can raise or lower profit and total surplus depending on the source of hidden information, and non-disclosure can be optimal if disclosure affects the agent's motivation. If the contract varies with the hidden information and, thus, with the disclosure rule, disclosure or non-disclosure can be optimal depending on whether the party's payoff is convex or concave in the information variable, respectively.

6.
In designing consumer durables such as appliances and power tools, it is important to account for variations in product performance across different usage situations and conditions. Since the specific usage of the product and the usage conditions can vary, the resultant variations in product performance also can impact consumer preferences for the product. Therefore, any new product that is designed should be robust to these variations—both in product performances and consumer preferences. This article refers to a robust product design as a design that has (1) the best possible (engineering and market) performance under the worst‐case variations and (2) the least possible sensitivity in its performance under the variations. Achieving these robustness criteria, however, implies consideration of a large number of design factors across multiple functions. This article's objectives are (1) to provide a tutorial on how variations in product performance and consumer preferences can be incorporated in the generation and comparison of design alternatives and (2) to apply a multi‐objective genetic algorithm (MOGA) that incorporates multifunction criteria in order to identify better designs while incorporating the robustness criteria in the selection process. Since the robustness criteria are based on variations in engineering performance as well as consumer preferences, the identified designs are robust and optimal from different functional perspectives, a significant advantage over extant approaches that do not consider robustness issues from multifunction perspectives. This study's approach is particularly useful for product managers and product development teams, who are charged with developing prototypes. They may find the approach helpful for obtaining customers' buy‐in as well as internal buy‐in early on in the product development cycle and thereby for reducing the cost and time involved in developing prototypes. This study's approach and its usefulness are illustrated using a case‐study application of prototype development for a handheld power tool.
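The two robustness criteria named above (best worst-case performance and least sensitivity across usage variations) translate into a short selection routine. The sketch below scores candidate designs over sampled usage conditions and keeps the non-dominated ones; the performance function, design encoding, and scenario distribution are placeholder assumptions, and a full treatment would feed these scores to a multi-objective GA such as NSGA-II rather than enumerate a handful of candidates.

```python
import numpy as np

rng = np.random.default_rng(0)

def performance(design, scenario):
    """Placeholder performance model (higher is better): 'design' is a
    2-vector of design variables, 'scenario' a scalar usage condition."""
    return design[0] * scenario - 0.5 * design[1] * scenario ** 2

# Candidate designs and sampled usage conditions (all illustrative).
designs = [np.array([1.0, 0.10]), np.array([1.2, 0.30]), np.array([0.9, 0.05])]
scenarios = rng.uniform(0.5, 2.0, size=200)

# Robustness criteria: (1) worst-case performance, (2) performance spread.
scores = []
for d in designs:
    perf = np.array([performance(d, s) for s in scenarios])
    scores.append((perf.min(), perf.std()))

def dominated(i):
    """Design i is dominated if another design has a better-or-equal worst
    case AND lower-or-equal sensitivity, and is strictly better in one."""
    wi, si = scores[i]
    return any(wj >= wi and sj <= si and (wj > wi or sj < si)
               for j, (wj, sj) in enumerate(scores) if j != i)

robust = [i for i in range(len(designs)) if not dominated(i)]
print("non-dominated (robust) design indices:", robust)
```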

7.
In an auction of a divisible object, bidders' demand functions are often assumed to be nonincreasing, meaning that bidders are willing to pay less or the same price for every additional unit. Under this assumption, the optimal allocation that maximizes the auctioneer's revenue can be found using a greedy-based procedure. This article argues that situations may arise where a bidder may need to express her preferences through a nondecreasing demand function; when such a bidder is present in the auction, the greedy-based procedure does not guarantee the optimal allocation. Thus, this article proposes a mixed integer program that finds the optimal allocation in a divisible-object auction in which bidders submit their bids as arbitrary stepwise demand functions. The practical aspect of the mathematical program is presented by means of a simple yet illustrative example in a treasury bond auction setting. The results of the auctioneer's revenue are reported as a function of the number of bidders with nonincreasing and nondecreasing demand functions.
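To see why the greedy procedure suffices under nonincreasing demand, consider the sketch below, in which each bidder submits a stepwise schedule of (price, quantity) steps and the auctioneer fills the highest-priced steps first. The bid data and supply are invented for illustration; when a bidder's steps instead rise with quantity (a nondecreasing schedule), this one-pass greedy allocation is exactly what can fail to maximize revenue, which is what motivates the article's mixed integer program.

```python
def greedy_allocation(bids, supply):
    """Allocate a divisible object greedily to the highest-priced bid steps.
    bids: {bidder: [(price, quantity), ...]} stepwise demand schedules.
    Returns (allocation per bidder, revenue). Optimal only when every
    bidder's schedule is nonincreasing in price."""
    steps = [(price, qty, bidder)
             for bidder, schedule in bids.items()
             for price, qty in schedule]
    steps.sort(key=lambda s: s[0], reverse=True)   # highest price first

    allocation = {bidder: 0.0 for bidder in bids}
    revenue, remaining = 0.0, supply
    for price, qty, bidder in steps:
        take = min(qty, remaining)
        allocation[bidder] += take
        revenue += price * take
        remaining -= take
        if remaining <= 0:
            break
    return allocation, revenue

# Illustrative nonincreasing schedules: willing to pay less for extra units.
bids = {"A": [(10.0, 3), (8.0, 2)], "B": [(9.0, 4), (7.0, 3)]}
print(greedy_allocation(bids, supply=6))   # A gets 3 @ 10, B gets 3 @ 9
```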

8.
In this paper, we propose a dynamic model to simultaneously determine the optimal position of the decoupling point and the production-inventory plan in a supply chain such that the total cost of the deviation from the target production rate and the target inventory level is minimized. Using optimal control theory, we derive the closed form of the optimal solution when the production smoothing policy and the zero-inventory policy are applied. The result indicates that under the production smoothing policy, overestimation of the demand rate during the pre-decoupling stage guarantees the existence of the optimal decoupling point, whereas under the zero-inventory policy the optimal decoupling point exists when the demand rate is underestimated. We also analyze the behavior of the optimal production rate and inventory level, as well as the effect of problem parameters such as the length of the product life cycle and the forecast error on performance.
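A discrete-time analogue of this trade-off can be grid-searched directly, which makes the role of forecast error visible: producing to the (overestimated) forecast before the decoupling point keeps the production rate on target but builds inventory, while producing to order afterwards does the opposite. This is an illustrative simulation, not the paper's closed-form optimal-control solution, and the horizon, forecast, targets, and cost weights are invented.

```python
import numpy as np

T = 20                        # planning horizon (periods)
forecast = np.full(T, 10.0)   # forecast demand rate (overestimated)
actual = np.full(T, 9.0)      # realized demand rate
target_rate, target_inv = 10.0, 0.0
w_inv, w_rate = 1.0, 100.0    # weights on inventory and rate deviations

def deviation_cost(dp):
    """Total squared deviation cost when decoupling occurs at period dp:
    make-to-stock at the forecast rate before dp, make-to-order after."""
    inv, cost = 0.0, 0.0
    for t in range(T):
        prod = forecast[t] if t < dp else actual[t]
        inv += prod - actual[t]
        cost += (w_rate * (prod - target_rate) ** 2
                 + w_inv * (inv - target_inv) ** 2)
    return cost

best_dp = min(range(T + 1), key=deviation_cost)
print("optimal decoupling point:", best_dp,
      " total deviation cost:", round(deviation_cost(best_dp), 1))
```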

9.
This paper develops a coordination mechanism for a supply chain consisting of one manufacturer and n Cournot competing retailers when the production cost and demand are simultaneously disrupted. This differs from traditional supply chain coordination models, which consider the static case or a disruption to demand or cost alone. A coordination mechanism with revenue sharing is considered, and the effects of production cost and demand disruptions on the revenue sharing contract are discussed in order to investigate the players' optimal strategies under disruption. A penalty cost is introduced explicitly to capture the production deviation cost caused by the disruptions. We find that the coordination contract that accounts for the production deviation cost differs from the contract without disruption, and that disruptions may affect order quantities, wholesale prices, and the revenue sharing contract itself. Optimal strategies for different disruption levels are then proposed under the centralized decision-making mode; for the decentralized mode, the improved revenue sharing contract can be used to coordinate the decentralized supply chain effectively. Finally, the theoretical results are illustrated with numerical examples.
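The threshold flavor of "optimal strategies for different disruption levels" can be seen in a small centralized benchmark: when the deviation penalty is charged against the pre-disruption plan, small disruptions leave the original quantity optimal and only large ones justify re-planning. The linear demand curve, cost, penalty rate, and disruption sizes below are invented for illustration and are not the paper's model of n Cournot retailers.

```python
import numpy as np

a, b, c = 100.0, 1.0, 20.0        # baseline inverse demand p = a - bQ, unit cost
Q0 = (a - c) / (2 * b)            # pre-disruption centralized plan
lam = 15.0                        # unit penalty for deviating from the plan

def centralized_profit(Q, da, dc):
    """Supply-chain profit after demand (da) and cost (dc) disruptions,
    net of the deviation penalty around the original plan Q0."""
    return (a + da - b * Q) * Q - (c + dc) * Q - lam * abs(Q - Q0)

grid = np.linspace(0, 100, 10001)
for da, dc in [(0, 0), (10, 0), (40, 5), (-30, 0)]:
    Q_star = grid[np.argmax([centralized_profit(Q, da, dc) for Q in grid])]
    print(f"da={da:+5.0f} dc={dc:+4.0f}  Q*={Q_star:6.2f}  (plan Q0={Q0:.1f})")
# Small disruptions leave the original plan optimal (the penalty dominates);
# large ones justify changing the quantity, echoing the paper's
# disruption-level dependent strategies.
```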

10.
In the absence of commitment to auditing, we study the optimal auditing contract when collusion between an agent and an auditor is possible. We show that the auditor can be totally useless if the auditor's independence can be compromised with relative ease. Even very stiff sanctions on fraud will be unable to make auditing optimal. We then derive a demand for independent external auditing. We endogenize the collusion cost as the cost arising from the risk of future detection. We also derive a justification for the focus of the recent audit reforms on penalties on CEOs in cases of audit fraud.

11.
12.
This paper presents a model for calculating the optimal cutting feed rate and spindle speed for a stand-alone cutting machine. The optimal cutting conditions are determined for three different objective functions—minimum expected cycle time, minimum expected cost per unit, and maximum expected profit rate—under the failure replacement strategy, taking into account cutting tool constraints and machine limitations. We also examine the relationships among the optimal solutions and present the efficient range of the feed rate.
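One common way to set up the cost-per-unit objective is through an extended Taylor tool-life relation and a grid search over cutting speed and feed subject to machine limits, as sketched below. The Taylor constants, cost rates, workpiece dimensions, and limits are illustrative placeholders rather than the paper's data; the cycle-time and profit-rate objectives would be handled the same way with different objective functions.

```python
import numpy as np

# Illustrative machining economics under an extended Taylor tool-life law:
# T = C / (v**(1/n) * f**(1/m)), tool life in minutes.
C, n, m = 4.0e8, 0.25, 0.5
D, L = 50.0, 200.0              # workpiece diameter and length (mm)
labor_rate = 1.0                # machine + operator cost per minute
tool_cost, t_change = 15.0, 2.0 # cost per tool edge, tool-change time (min)

def cost_per_unit(v, f):
    """Expected cost of one part at cutting speed v (m/min), feed f (mm/rev)."""
    t_cut = np.pi * D * L / (1000.0 * v * f)       # machining time per part
    T = C / (v ** (1 / n) * f ** (1 / m))          # Taylor tool life
    tools_per_part = t_cut / T
    return (labor_rate * (t_cut + t_change * tools_per_part)
            + tool_cost * tools_per_part)

speeds = np.linspace(50, 400, 200)                 # machine speed limits
feeds = np.linspace(0.05, 0.5, 100)                # feed limits
V, F = np.meshgrid(speeds, feeds)
cost = np.vectorize(cost_per_unit)(V, F)
i, j = np.unravel_index(np.argmin(cost), cost.shape)
print(f"v* = {V[i, j]:.0f} m/min, f* = {F[i, j]:.2f} mm/rev, "
      f"cost/unit = {cost[i, j]:.2f}")
print(f"spindle speed ≈ {1000 * V[i, j] / (np.pi * D):.0f} rpm")
```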

13.
Most cases of cost overruns in public procurement are related to important changes in the initial project design. This paper provides a rationale for the observed pattern of underinvestment in design specification in public procurement. We propose a two‐stage model in which the sponsor first decides how much to invest in design specification and then auctions the project to horizontally differentiated contractors. After the contract has been awarded and implemented, the sponsor and contractor receive new information about the optimal project design and renegotiate the contract to accommodate changes in the initial project's design. We show that the sponsor's optimal strategy is to underinvest in design specification, which makes significant cost overruns likely. Since no such underinvestment occurs when contractors are not horizontally differentiated, cost overruns are seen to arise as a consequence of a lack of competition in the procurement market.

14.
This paper investigates the effects of the number of firms and their product‐type on broadband Internet quality. We estimate a model that relates the actual speeds delivered in census block groups to the number of wireline and wireless internet service providers (ISPs), cost and demand conditions, and correction terms for the endogeneity of market structure. Model estimates show four main findings. Wireline speeds are often higher in markets with two or more wireline ISPs than with a single wireline ISP. Excluding the correction terms from the analysis understates this effect. Increases in wireline speeds are larger in the upstream direction, and there is no relationship between wireline speeds and the number of wireless ISPs.
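The role of the correction terms can be illustrated with a control-function style two-stage regression on synthetic data: an excluded cost shifter drives entry in the first stage, and the first-stage residual is added to the speed regression to absorb the unobservable that entry responds to. This is only a generic sketch of why omitting such terms can understate the competition effect; the data-generating process, variables, and estimator below are invented and are not the paper's specification.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 5000

# Synthetic census block groups (all numbers invented for illustration).
cost_shifter = rng.normal(size=n)     # excluded entry-cost shifter
demand = rng.normal(size=n)
u = rng.normal(size=n)                # unobserved speed shifter

# Entry avoids high-cost areas, which also have low unobserved speed, so a
# naive OLS understates the competition effect in this constructed example.
n_isp = (1.5 - 0.8 * cost_shifter + 0.3 * demand
         - 0.5 * u + rng.normal(size=n)).round().clip(1, 4)
log_speed = 1.0 + 0.25 * n_isp + 0.2 * demand + u + 0.1 * rng.normal(size=n)

# Stage 1: market structure on cost and demand shifters; keep the residual.
X1 = sm.add_constant(np.column_stack([cost_shifter, demand]))
resid = sm.OLS(n_isp, X1).fit().resid

# Stage 2: speeds on ISP count, with and without the correction term.
X_naive = sm.add_constant(np.column_stack([n_isp, demand]))
X_corr = sm.add_constant(np.column_stack([n_isp, demand, resid]))
print("ISP effect, naive OLS      :", sm.OLS(log_speed, X_naive).fit().params[1])
print("ISP effect, with correction:", sm.OLS(log_speed, X_corr).fit().params[1])
```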

15.
This article explores how the industry life‐cycle theory, proposed by Abernathy and Utterback, can be reinterpreted from the viewpoint of product architecture dynamics. The “long tail” of the automobile industry life cycle, observed during the past several decades, is explained by an evolutionary framework in which a product's architecture is treated as an endogenous variable affected by customers' functional requirements, environmental‐technical constraints, and their changes. The present article shows that the existing industry life‐cycle model effectively explains the early history of automotive product‐process innovations, but that it fails to explain the “long tail” of the life cycle, and that an evolutionary approach to product architectures can be used to explain the architectural sequence and the long‐term trend of the increase in nonradical innovations. That is, the industry life‐cycle model certainly fits well with the actual pattern of product‐process innovations at the early phase of the automobile's development, from the 1880s (invention) through the 1920s (the end of the Model T) and into the 1960s, when product differentiation continued without significant product/process innovations (e.g., the Big Three's annual model change). But the question remains how this model can explain the rest of the industry's history (1970s to 2010s), which is characterized by “rapid incremental innovations,” or a “long tail of the life cycle,” with its upward trend of technological advancement rather than the end of innovations or the beginning of another industry life cycle (i.e., “dematurity”). The evolutionary framework of product architecture predicts that the macro architecture of a given product category (e.g., passenger cars) will be relatively integral when the functional requirements that customers expect, the constraints imposed by society and the government, and the physical‐technical limits inherent in the product are strong, and that it will be relatively modular when they are weaker. The dynamic architectural analysis starts from the Lancaster‐type analysis of a set of function‐price frontiers for a given product category (e.g., cars). Based on design theories, it hypothesizes that the shapes of function‐price frontiers differ between integral models and modular models. It then hypothesizes that price‐oriented customers tend to choose relatively modular products, whereas function‐oriented customers choose relatively integral products more often than not, other things being equal. Thus, the macro architecture of a given product can be determined depending on whether each architecture's price‐function frontier touches the price‐function preference curves of its customers. As for the future architecture of the car, its macro architecture, determined by markets and environments, will remain relatively integral and complex as long as it continues to be a fast‐moving heavy artifact in the public space, whereas its micro architecture, determined by engineers, will be somewhat mixed, as the engineers try to simplify and modularize the automobile design wherever the market and technology permit. The evolutionary framework of architectures also predicts that the architectural sequence inside the industry life cycle will differ across products (e.g., cars and computers) depending upon the dynamic patterns of technological advancement (e.g., shifts of the price‐function frontier) and market‐societal constraints (e.g., shifts of the price‐function preference curve).

16.
From the perspective of fit clearance (or interference), probability and statistics are used to analyze the influence of the hole and shaft dimension distributions on the service life of mechanical products, and an improvement measure is proposed: changing the fundamental deviation of the non-datum mating part so as to appropriately reduce the initial clearance of the fit, thereby increasing the wear reserve of the parts and extending the service life of the mechanical product.
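The probabilistic argument translates directly into a small calculation: if hole and shaft diameters are approximately normally distributed within their tolerance zones, the clearance is also normal, and shifting the fundamental deviation of the non-datum part moves the mean clearance without changing its spread, trading a larger wear reserve against a (here negligible) interference risk. The dimensions, tolerances, and wear limit below are illustrative, not taken from the paper.

```python
from math import sqrt
from statistics import NormalDist

# Illustrative H7/f7-style fit on a 50 mm bore; all values are placeholders.
hole_mu, hole_sigma = 50.0125, 0.025 / 6      # mean and s.d. within tolerance
shaft_mu, shaft_sigma = 49.9625, 0.025 / 6

def clearance_stats(shift=0.0):
    """Clearance = hole - shaft; 'shift' raises the shaft's fundamental
    deviation, reducing the initial clearance without changing its spread."""
    mu = hole_mu - (shaft_mu + shift)
    sigma = sqrt(hole_sigma ** 2 + shaft_sigma ** 2)
    return mu, sigma

wear_limit = 0.120   # clearance at which the fit is considered worn out (mm)
for shift in (0.0, 0.020):
    mu, sigma = clearance_stats(shift)
    margin = wear_limit - mu                      # wear reserve before failure
    p_interference = NormalDist(mu, sigma).cdf(0.0)
    print(f"shift={shift:.3f} mm  mean clearance={mu:.4f} mm  "
          f"wear reserve={margin:.4f} mm  P(interference)={p_interference:.2e}")
```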

17.
The Indian IT services sector has grown from small beginnings at the bottom of value creation to a major player in the global information and communications technology (ICT) industry. It commands a 55% share of the global market for IT services. India's IT sector value proposition, in terms of low cost with a large supply of high quality talent, is compelling. As a result, India has become the premier choice not only for the outsourcing of IT services by the developed world's multinational corporations (MNCs) but also for locating their own Global In-house Centers (GICs), which simultaneously compete and partner with local firms. This gave rise to six additional clusters beyond the earliest, largest and most robust cluster, Bangalore. The paper provides a review of relevant literature; develops a conceptual framework for the evaluation of clusters; presents data and analysis with respect to relative size, growth, specialization, MNC presence, connectivity to local firms through expatriates and returning Indians, and innovation; and discusses the adequacy of ICT infrastructure for future growth. Although there are clear signs that the Indian IT sector has been moving towards a regime of providing high-end value added services, the sector's value proposition – lower cost combined with a large supply of high quality talent – remains the single most compelling reason for the rise and growth of multiple IT services export clusters; the sector's growth thus appears to be a case of growth by replication rather than innovation. While the old adage "people follow jobs" still holds for a large part of the labor force, there is little doubt that the sprawling IT services clusters in India – with more to come from Tier II and Tier III cities – indicate, in fact, that "jobs follow talent." Both local firms and the MNCs, through their GICs, are pushing the boundaries of location farther and farther to continue to leverage cost advantage and available pools of talent.

18.
If a publicly-owned firm has a higher marginal cost than a private firm, partial public ownership may be welfare-improving if the public firm acts as Stackelberg leader. If the private firm's marginal cost is private information, a simple transfer function is truth-eliciting. If the stock market is efficient, the cost of renationalization is “small”.

19.
The question of when and in what quantity to obtain required bond financing takes on significance due to fixed flotation costs and the opportunity to invest excess long-term debt in marketable securities. In prior research, the optimal size and timing of bond issues is most often determined as the solution to a cost minimization problem. In this paper, we first demonstrate that previous studies have incorrectly identified the opportunity cost of excess debt. We then delineate the opportunity cost in a form consistent with modern capital structure theory. Our results indicate that a firm's capital structure and marginal tax rate are important determinants of the optimal size and timing of its bond issues.
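The basic trade-off is between fixed flotation costs, which favor a few large issues, and the cost of carrying excess debt in marketable securities, which favors many small ones; an EOQ-style square-root rule captures that trade-off, while the paper's contribution concerns how the carrying (opportunity) cost should be measured, with capital structure and the marginal tax rate entering that spread. The figures below are illustrative assumptions, and the steady-drawdown assumption behind the Q/2 average balance is part of the sketch, not the paper.

```python
from math import sqrt

annual_need = 50_000_000      # long-term debt required over the year ($)
flotation_fixed = 400_000     # fixed cost per bond issue ($)
carry_rate = 0.015            # net annual cost of carrying excess debt
                              # (the spread the paper argues depends on
                              # capital structure and the marginal tax rate)

# EOQ-style sizing: balance fixed issue costs against carrying costs,
# assuming the proceeds are drawn down steadily (average balance Q/2).
issue_size = sqrt(2 * flotation_fixed * annual_need / carry_rate)
issues_per_year = annual_need / issue_size
total_cost = (flotation_fixed * issues_per_year
              + carry_rate * issue_size / 2)

print(f"optimal issue size    ≈ ${issue_size:,.0f}")
print(f"issues per year       ≈ {issues_per_year:.2f}")
print(f"annual financing cost ≈ ${total_cost:,.0f}")
```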

20.
This article estimates the parameters of a cost function for the process of gas transmission based on the two basic capital inputs to the process: pipe and compressors. This in turn allows us to assess the combination of capital, operating, and maintenance costs that minimizes the total cost of a natural gas transportation system. We further show that the industry's production technology exhibits increasing returns to scale; that is, we find that the long-run marginal cost is lower than the long-run average cost per unit. The natural gas transmission cost function derived is consistent with the engineering aspects of the industry and may be used to find the minimal cost of a system to transport natural gas.
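The reported ranking of long-run marginal and average cost follows directly from increasing returns to scale. With a stylized power-form long-run cost function (a stand-in for illustration, not the article's estimated specification),

\[ C(q) = \alpha q^{\beta}, \qquad 0 < \beta < 1, \]

the marginal and average costs are

\[ MC(q) = \frac{dC}{dq} = \alpha \beta q^{\beta - 1}, \qquad AC(q) = \frac{C(q)}{q} = \alpha q^{\beta - 1}, \qquad \frac{MC(q)}{AC(q)} = \beta < 1, \]

so marginal cost lies below average cost at every throughput level whenever the cost elasticity \(\beta\) is below one, i.e., whenever returns to scale are increasing.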
