Similar documents
 Found 20 similar documents; search time: 281 ms
1.
Taking the ZLG protocol stack as an example, this paper discusses the method and process of implementing an embedded TCP/IP protocol stack on the EasyARM2200 experimental platform, with particular emphasis on a detailed analysis of how the TCP protocol is handled in this embedded system. The implementation of the embedded TCP/IP stack is verified through TCP/IP communication between the ARM experimental platform and a host PC.
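A host-side loopback check of this kind can be sketched in Python; this is not the ZLG stack itself, and the local echo server below merely stands in for the EasyARM2200 board (all names are illustrative):

```python
import socket
import threading

def verify_tcp_stack(payload=b"ping"):
    # Bind an ephemeral port first so the client cannot race the listener.
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", 0))
    srv.listen(1)
    port = srv.getsockname()[1]

    def echo():
        # Stand-in for the board's TCP stack: echo one message back.
        conn, _ = srv.accept()
        conn.sendall(conn.recv(1024))
        conn.close()

    t = threading.Thread(target=echo)
    t.start()
    # Host-PC side of the test: send a payload, expect it back intact.
    cli = socket.create_connection(("127.0.0.1", port), timeout=5)
    cli.sendall(payload)
    reply = cli.recv(1024)
    cli.close()
    t.join()
    srv.close()
    return reply == payload
```

In a real deployment the client side would target the board's IP address instead of the loopback interface.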

2.
Gao Zhenqiang. Value Engineering (《价值工程》), 2011, 30(34): 162-164
Wireless mesh networking, a new network technology, has gained broad attention and recognition in both academia and industry, and is becoming a basic technology for the wireless broadband access layer of the next-generation Internet. As a general-purpose embedded platform, ARM is increasingly favored in wireless communications for its low power consumption and fast data processing. After reviewing the origin, definition, architecture, and characteristics of wireless mesh networks, this paper designs the hardware platform for mesh network nodes and routers on an S3C2440-based ARM platform, implements the wireless mesh protocol on that platform, and builds working mesh network nodes.

3.
Shao Hua. Value Engineering (《价值工程》), 2013, (32): 52-53
This paper presents the design of an ARM-based embedded punching machine. The system uses an ARM11 as the main controller, captures images from an analog CCD, locates hole centers with an image-processing algorithm, and then drives stepper motors for positioning and punching. Full-machine field tests show that all performance indicators meet the design targets.
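The circle-center location step can be illustrated with a standard least-squares (Kåsa) circle fit; the abstract does not specify the paper's actual algorithm, so this is only a generic sketch:

```python
import math

def fit_circle(points):
    # Kåsa fit: model x^2 + y^2 = 2*a*x + 2*b*y + c and solve the
    # 3x3 normal equations for (a, b, c); then r = sqrt(c + a^2 + b^2).
    A = [[0.0] * 3 for _ in range(3)]
    rhs = [0.0] * 3
    for x, y in points:
        row = [2 * x, 2 * y, 1.0]
        z = x * x + y * y
        for i in range(3):
            rhs[i] += row[i] * z
            for j in range(3):
                A[i][j] += row[i] * row[j]
    # Gaussian elimination with partial pivoting on the 3x3 system.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for j in range(col, 3):
                A[r][j] -= f * A[col][j]
            rhs[r] -= f * rhs[col]
    sol = [0.0] * 3
    for i in (2, 1, 0):  # back substitution
        s = rhs[i] - sum(A[i][j] * sol[j] for j in range(i + 1, 3))
        sol[i] = s / A[i][i]
    a, b, c = sol
    return a, b, math.sqrt(c + a * a + b * b)
```

In practice the input points would come from edge detection on the CCD image rather than being given directly.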

4.
Taking the ZLG protocol stack as an example, this paper discusses the method and process of implementing an embedded TCP/IP protocol stack on the EasyARM2200 experimental platform, with particular emphasis on a detailed analysis of how the TCP protocol is handled in this embedded system. The implementation of the embedded TCP/IP stack is verified through TCP/IP communication between the ARM experimental platform and a host PC.

5.
ARM poses a clear future threat to x86. Starting from the low end, ARM has been steadily eroding the market share of netbooks and similar devices, and as ARM performance continues to improve, many more markets will open to it. ARM's very high performance-per-watt and the excellent applicability of its whole platform are the key factors behind its popularity with users and its success; for a long time to come, these advantages will help ARM expand its markets and grow stronger.

6.
This paper discusses the bus-interface differences between the ARM9-core processor S3C2410 and the CAN controller SJA1000, and proposes an interface design between the SJA1000 and the S3C2410. The design is low-cost and stable, and can serve as a reference for similar interfaces on other ARM and DSP devices.

7.
Zhou Fang, Xu Jun. Value Engineering (《价值工程》), 2015, (14): 181-183
For an ad hoc communication system operating in wireless burst mode, this paper designs a physical-layer platform based on an FPGA and an ARM: the FPGA handles the transceiver's baseband signal processing, while the ARM runs the protocol processing. Based on the selected main chips, the circuit interfaces between the major modules are designed. Practice shows that the design meets the requirements of the ad hoc communication network.

8.
Design of a data acquisition system based on μC/OS-II (cited 1 time: 0 self-citations, 1 by others)
This paper presents the design of a data acquisition system that uses an ARM7 as the control core and ports the embedded operating system μC/OS-II, on which the system software is built.

9.
To address the shortcomings of current domestic hydrological data loggers, this paper presents a design method for a multi-channel hydrological data acquisition system based on an ARM embedded processor. By acquiring multiple channels of analog signals, it exploits the advantages of embedded systems in hydrological data acquisition.

10.
This paper proposes a concrete design for a video surveillance system based on embedded technology. With an S3C2510A as the processor core and ARM Linux as the operating system platform, the hardware and software platform of an embedded video surveillance system is built; the hardware structure is briefly introduced, and the design and implementation of the software system are described in detail.

11.
In this paper, we introduce a new flexible mixed model for multinomial discrete choice where the key individual- and alternative-specific parameters of interest are allowed to follow an assumption-free nonparametric density specification, while other alternative-specific coefficients are assumed to be drawn from a multivariate Normal distribution, which eliminates the independence of irrelevant alternatives assumption at the individual level. A hierarchical specification of our model allows us to break down a complex data structure into a set of submodels with the desired features that are naturally assembled in the original system. We estimate the model, using a Bayesian Markov Chain Monte Carlo technique with a multivariate Dirichlet Process (DP) prior on the coefficients with nonparametrically estimated density. We employ a “latent class” sampling algorithm, which is applicable to a general class of models, including non-conjugate DP base priors. The model is applied to supermarket choices of a panel of Houston households whose shopping behavior was observed over a 24-month period in years 2004–2005. We estimate the nonparametric density of two key variables of interest: the price of a basket of goods based on scanner data, and driving distance to the supermarket based on their respective locations. Our semi-parametric approach allows us to identify a complex multi-modal preference distribution, which distinguishes between inframarginal consumers and consumers who strongly value either lower prices or shopping convenience.
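The clustering behavior induced by a Dirichlet Process prior can be illustrated, in a much-simplified form, by a Chinese restaurant process draw; this is a generic sketch, not the paper's latent-class sampler:

```python
import random

def crp_partition(n, alpha, seed=0):
    # Chinese restaurant process: customer i joins an existing cluster
    # with probability size/(i + alpha), or opens a new cluster with
    # probability alpha/(i + alpha) -- the partition a DP prior induces.
    rng = random.Random(seed)
    sizes = []  # current cluster sizes
    for i in range(n):
        u = rng.random() * (i + alpha)
        acc = 0.0
        for k, s in enumerate(sizes):
            acc += s
            if u < acc:
                sizes[k] += 1  # join existing cluster k
                break
        else:
            sizes.append(1)    # open a new cluster
    return sizes
```

Smaller `alpha` concentrates the mass on fewer clusters; larger `alpha` produces more, smaller clusters.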

12.
The practical relevance of several concepts of exogeneity of treatments for the estimation of causal parameters based on observational data is discussed. We show that the traditional concepts, such as strong ignorability and weak and super-exogeneity, are too restrictive if interest lies in average effects (i.e. not in distributional effects of the treatment). We suggest a new definition of exogeneity, KL-exogeneity. It does not rely on distributional assumptions and is not based on counterfactual random variables. As a consequence it can be empirically tested using a proposed test that is simple to implement and is distribution-free.

13.
The objective of this paper is to integrate the generalized gamma (GG) distribution into the information theoretic literature. We study information properties of the GG distribution and provide an assortment of information measures for the GG family, which includes the exponential, gamma, Weibull, and generalized normal distributions as its subfamilies. The measures include entropy representations of the log-likelihood ratio, AIC, and BIC, discriminating information between the GG distribution and its subfamilies, a minimum discriminating information function, power transformation information, and a maximum entropy index of fit to histogram. We provide the full parametric Bayesian inference for the discrimination information measures. We also provide Bayesian inference for the fit of the GG model to a histogram, using a semi-parametric Bayesian procedure referred to as the maximum entropy Dirichlet (MED). The GG information measures are computed for duration of unemployment and duration of CEO tenure.
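The subfamily structure mentioned above is easy to verify numerically. The sketch below assumes Stacy's parameterization f(x; a, d, p) = p x^(d-1) exp(-(x/a)^p) / (a^d Γ(d/p)), which is one common convention; the paper's own parameterization may differ:

```python
import math

def gg_pdf(x, a, d, p):
    # Generalized gamma density in Stacy's (a, d, p) parameterization.
    return (p / (a**d * math.gamma(d / p))) * x**(d - 1) * math.exp(-(x / a)**p)

def check_subfamilies():
    # d = p = 1 recovers the exponential(scale=a) density;
    # d = p recovers the Weibull(shape=p, scale=a) density.
    a, x = 2.0, 1.3
    expo = math.exp(-x / a) / a
    weib = (3 / a) * (x / a)**2 * math.exp(-(x / a)**3)
    return (abs(gg_pdf(x, a, 1, 1) - expo) < 1e-12,
            abs(gg_pdf(x, a, 3, 3) - weib) < 1e-12)
```

The gamma subfamily is obtained with p = 1 and d equal to the gamma shape parameter.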

14.
This paper analyzes the higher-order properties of the estimators based on the nested pseudo-likelihood (NPL) algorithm and the practical implementation of such estimators for parametric discrete Markov decision models. We derive the rate at which the NPL algorithm converges to the MLE and provide a theoretical explanation for the simulation results in Aguirregabiria and Mira [Aguirregabiria, V., Mira, P., 2002. Swapping the nested fixed point algorithm: A class of estimators for discrete Markov decision models. Econometrica 70, 1519–1543], in which iterating the NPL algorithm improves the accuracy of the estimator. We then propose a new NPL algorithm that can achieve quadratic convergence without fully solving the fixed point problem in every iteration and apply our estimation procedure to a finite mixture model. We also develop one-step NPL bootstrap procedures for discrete Markov decision models. The Monte Carlo simulation evidence based on a machine replacement model of Rust [Rust, J., 1987. Optimal replacement of GMC bus engines: An empirical model of Harold Zurcher. Econometrica 55, 999–1033] shows that the proposed one-step bootstrap test statistics and confidence intervals improve upon the first order asymptotics even with a relatively small number of iterations.
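The contrast between plain fixed-point iteration (linear convergence) and Newton-type acceleration (quadratic convergence) that motivates the paper can be illustrated on a toy contraction; this is not the NPL algorithm itself, just the generic numerical idea:

```python
import math

def picard(f, x0, n):
    # Plain successive approximation x_{k+1} = f(x_k): linear convergence.
    x = x0
    for _ in range(n):
        x = f(x)
    return x

def newton_fixed_point(f, fprime, x0, n):
    # Newton's method applied to g(x) = x - f(x): quadratic convergence.
    x = x0
    for _ in range(n):
        x = x - (x - f(x)) / (1.0 - fprime(x))
    return x

# Toy contraction: fixed point of x = cos(x)/2, with |f'| <= 1/2.
f = lambda x: math.cos(x) / 2.0
fp = lambda x: -math.sin(x) / 2.0
```

From the same starting point, a handful of Newton steps reaches machine precision while Picard iteration is still at a few digits.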

15.
In this paper, we introduce a new Poisson mixture model for count panel data where the underlying Poisson process intensity is determined endogenously by consumer latent utility maximization over a set of choice alternatives. This formulation accommodates the choice and count in a single random utility framework with desirable theoretical properties. Individual heterogeneity is introduced through a random coefficient scheme with a flexible semiparametric distribution. We deal with the analytical intractability of the resulting mixture by recasting the model as an embedding of infinite sequences of scaled moments of the mixing distribution, and newly derive their cumulant representations along with bounds on their rate of numerical convergence. We further develop an efficient recursive algorithm for fast evaluation of the model likelihood within a Bayesian Gibbs sampling scheme. We apply our model to a recent household panel of supermarket visit counts. We estimate the nonparametric density of three key variables of interest: price, driving distance, and their interaction, while controlling for a range of consumer demographic characteristics. We use this econometric framework to assess the opportunity cost of time and analyze the interaction between store choice, trip frequency, search intensity, and household and store characteristics. We also conduct a counterfactual welfare experiment and compute the compensating variation for a 10%-30% increase in Walmart prices.
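A Poisson mixture for counts can be illustrated by the classic Poisson-Gamma case, whose marginal is negative binomial and overdispersed; the paper's utility-based intensity and semiparametric mixing distribution are not reproduced here:

```python
import random

def poisson_gamma_sample(mean_rate, shape, n, seed=1):
    # Mix a Poisson count over a Gamma-distributed rate. The marginal
    # is negative binomial with mean m and variance m + m^2/shape,
    # i.e. overdispersed relative to a plain Poisson.
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        lam = rng.gammavariate(shape, mean_rate / shape)
        # Poisson draw via unit-rate exponential waiting times:
        # count the arrivals whose cumulative time falls below lam.
        k, t = 0, rng.expovariate(1.0)
        while t < lam:
            k += 1
            t += rng.expovariate(1.0)
        out.append(k)
    return out
```

With `mean_rate=2` and `shape=1` the theoretical variance is 2 + 4 = 6, three times the mean.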

16.
The sample mean is one of the most natural estimators of the population mean based on an independent, identically distributed sample. However, if some control variate is available, the control variate method is known to reduce the variance of the sample mean. The control variate method often assumes that the variable of interest and the control variable are i.i.d. Here we assume that these variables are stationary processes with spectral density matrices, i.e. dependent. We then propose an estimator of the mean of the stationary process of interest using the control variate method based on a nonparametric spectral estimator. It is shown that this estimator improves on the sample mean in the sense of mean square error. The analysis is also extended to the case where the mean dynamics take the form of a regression. We then propose a control variate estimator for the regression coefficients which improves on the least squares estimator (LSE). Numerical studies are given to show how our estimator improves on the LSE.
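The basic control-variate construction for i.i.d. data (not the paper's spectral, stationary-process version) can be sketched as follows, estimating E[exp(U)] for U ~ Uniform(0,1) with U itself as the control variate:

```python
import math
import random

def control_variate_mean(n, seed=0):
    # Estimate E[exp(U)], U ~ Uniform(0,1); true value is e - 1.
    # Control variate: U, with known mean 1/2.
    rng = random.Random(seed)
    us = [rng.random() for _ in range(n)]
    ys = [math.exp(u) for u in us]
    ybar = sum(ys) / n
    ubar = sum(us) / n
    # Estimated optimal coefficient b = Cov(Y, U) / Var(U).
    cov = sum((y - ybar) * (u - ubar) for y, u in zip(ys, us)) / n
    var_u = sum((u - ubar) ** 2 for u in us) / n
    b = cov / var_u
    # Control-variate estimator: subtract b times the control's error.
    adjusted = ybar - b * (ubar - 0.5)
    return ybar, adjusted
```

Because exp(U) and U are highly correlated, the adjusted estimator has a far smaller variance than the plain sample mean.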

17.
The generalised method of moments estimator may be substantially biased in finite samples, especially so when there are large numbers of unconditional moment conditions. This paper develops a class of first-order equivalent semi-parametric efficient estimators and tests for conditional moment restrictions models based on a local or kernel-weighted version of the Cressie–Read power divergence family of discrepancies. This approach is similar in spirit to the empirical likelihood methods of Kitamura et al. [2004. Empirical likelihood-based inference in conditional moment restrictions models. Econometrica 72, 1667–1714] and Tripathi and Kitamura [2003. Testing conditional moment restrictions. Annals of Statistics 31, 2059–2095]. These efficient local methods avoid the necessity of explicit estimation of the conditional Jacobian and variance matrices of the conditional moment restrictions and provide empirical conditional probabilities for the observations.
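The Cressie–Read power divergence family itself is easy to state; a minimal implementation for discrete distributions, with the lambda -> 0 and lambda -> -1 limits filled in by their KL forms, is:

```python
import math

def cressie_read(p, q, lam):
    # Cressie-Read power divergence between discrete distributions p, q:
    # CR_lam(p, q) = 2 / (lam * (lam + 1)) * sum_i p_i * ((p_i/q_i)**lam - 1).
    if abs(lam) < 1e-12:          # lam -> 0 limit: 2 * KL(p || q)
        return 2.0 * sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    if abs(lam + 1.0) < 1e-12:    # lam -> -1 limit: 2 * KL(q || p)
        return 2.0 * sum(qi * math.log(qi / pi) for pi, qi in zip(p, q))
    s = sum(pi * ((pi / qi) ** lam - 1.0) for pi, qi in zip(p, q))
    return 2.0 / (lam * (lam + 1.0)) * s
```

At lam = 1 the family reduces to Pearson's chi-square divergence, sum_i (p_i - q_i)^2 / q_i.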

18.
The technique of Monte Carlo (MC) tests [Dwass (1957, Annals of Mathematical Statistics 28, 181–187); Barnard (1963, Journal of the Royal Statistical Society, Series B 25, 294)] provides a simple method for building exact tests from statistics whose finite sample distribution is intractable but can be simulated (when no nuisance parameter is involved). We extend this method in two ways: first, by allowing for MC tests based on exchangeable possibly discrete test statistics; second, by generalizing it to statistics whose null distribution involves nuisance parameters [maximized MC (MMC) tests]. Simplified asymptotically justified versions of the MMC method are also proposed: these provide a simple way of improving standard asymptotics and dealing with nonstandard asymptotics.
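The basic exact Monte Carlo p-value of Dwass and Barnard can be sketched directly (no nuisance parameters; a continuous statistic is assumed so that ties can be ignored):

```python
import random

def mc_pvalue(observed, simulate, n_sim=99, seed=0):
    # Exact MC p-value: rank the observed statistic among n_sim
    # simulated null draws; p = (1 + #{sims >= observed}) / (n_sim + 1).
    # With a continuous statistic this test has exact level at
    # conventional sizes when (n_sim + 1) * alpha is an integer.
    rng = random.Random(seed)
    sims = [simulate(rng) for _ in range(n_sim)]
    ge = sum(1 for s in sims if s >= observed)
    return (1 + ge) / (n_sim + 1)
```

For example, with 99 null draws a rejection at the 5% level occurs exactly when the observed statistic ranks among the top 5 of the 100 values.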

19.
We propose a finite sample approach to some of the most common limited dependent variables models. The method rests on the maximized Monte Carlo (MMC) test technique proposed by Dufour [1998. Monte Carlo tests with nuisance parameters: a general approach to finite-sample inference and nonstandard asymptotics. Journal of Econometrics, this issue]. We provide a general way for implementing tests and confidence regions. We show that the decision rule associated with a MMC test may be written as a Mixed Integer Programming problem. The branch-and-bound algorithm yields a global maximum in finite time. An appropriate choice of the statistic yields a consistent test, while fulfilling the level constraint for any sample size. The technique is illustrated with numerical data for the logit model.
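A maximized MC p-value can be sketched by taking the sup of the MC p-value over a coarse nuisance-parameter grid; the branch-and-bound global search over a Mixed Integer Program described in the paper is replaced here by this crude simplification:

```python
import random

def mc_pvalue(observed, simulate, theta, n_sim=99, seed=0):
    # Monte Carlo p-value at one fixed nuisance-parameter value theta.
    rng = random.Random(seed)
    sims = [simulate(rng, theta) for _ in range(n_sim)]
    return (1 + sum(1 for s in sims if s >= observed)) / (n_sim + 1)

def mmc_pvalue(observed, simulate, theta_grid, n_sim=99, seed=0):
    # Maximized MC p-value: taking the sup over the nuisance space
    # gives a test that satisfies the level constraint; a finite grid
    # stands in for the global optimization.
    return max(mc_pvalue(observed, simulate, th, n_sim, seed)
               for th in theta_grid)
```

Rejecting only when the maximized p-value is below alpha guarantees the test never over-rejects, whatever the true nuisance value, at the cost of some conservativeness introduced by the maximization.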

20.
This paper applies the model confidence set (MCS) procedure of Hansen, Lunde and Nason (2003) to a set of volatility models. An MCS is analogous to the confidence interval of a parameter in the sense that it contains the best forecasting model with a certain probability. The key to the MCS is that it acknowledges the limitations of the information in the data. The empirical exercise is based on 55 volatility models, and the MCS includes about a third of these when evaluated by mean square error, whereas the MCS contains only a VGARCH model when the mean absolute deviation criterion is used. We conduct a simulation study which shows that the MCS captures the superior models across a range of significance levels. When we benchmark the MCS relative to a Bonferroni bound, the latter delivers inferior performance.


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.). ICP license 京ICP备09084417号