null
With the advancement of power electronics, new materials, and novel bearing technologies, there has been active development of high-speed machines in recent years. The simple rotor structure makes switched reluctance machines (SRMs) candidates for high-speed operation. This paper presents the design of a low-power, 50,000 RPM 6/4 SRM having a toroidally wound stator. Finite element analysis (FEA) shows an equivalence to conventionally wound SRMs in terms of torque capability. With the conventional asymmetric converter and classic angular control, this toroidal-winding SRM (TSRM) is able to produce 233.20 W of mechanical power with an efficiency of 75% at the FEA stage. Considering the enhanced cooling capability, as the winding is directly exposed to air, the toroidal winding is a good option for high-speed SRMs.
toroidal winding;high speed;switched reluctance machine;experimental validation
WoS
null
With advances in stem cell research and the development of intelligent biomaterials and three-dimensional biofabrication strategies, highly biomimetic tissues or organs can be engineered. Among all the biofabrication approaches, bioprinting based on inkjet printing technology promises to deliver and create biomimetic tissue with high throughput, digital control, and the capacity for single-cell manipulation. Therefore, this enabling technology has great potential in regenerative medicine and translational applications. The most current advances in organ and tissue bioprinting based on thermal inkjet printing technology are described in this review, including vasculature, muscle, cartilage, and bone. In addition, the benign side effect of bioprinting on the printed mammalian cells can be utilized for gene or drug delivery, which can be achieved conveniently during precise cell placement for tissue construction. With layer-by-layer assembly, three-dimensional tissues with complex structures can be printed using converted medical images. Therefore, bioprinting based on thermal inkjet is so far the most promising solution for engineering a vascular system into thick and complex tissues. Collectively, bioprinting has great potential and broad applications in tissue engineering and regenerative medicine. Future advances in bioprinting include the integration of different printing mechanisms to engineer biphasic or triphasic tissues with optimized scaffolds and further understanding of stem cell biology.
3D-Printing;Biomaterials;Bioprinting;Bone;Cartilage;Muscle;Stem cells;Tissue engineering
WoS
null
With the advance of technology, wireless sensor networks are used to gather information and monitor data in inaccessible and remote areas where a wired network is not feasible. Due to limited power back-up, sensor networks cannot be deployed on a wider scale. In-network processing is recommended to reduce communication overhead and energy consumption. Low-power analog processing circuits integrated within sensor nodes are suitable for this task. This paper reviews analog circuitry for data processing to improve energy efficiency, in addition to low-power networking protocols and security solutions.
Sensor networks;analog signal processing;filters;microcontroller;routing protocols
WoS
null
With the advent of agile software development methods and the need to write programs and software of the best quality in the shortest possible time, unorthodox practices in computer programming are becoming more and more common. One of these practices is pair programming, characterized by two programmers sharing the same computer for collaborative programming purposes. Pair programming implies a psychological and social interaction between the participating programmers. The goal of this paper is an evaluation of pair programming to determine the influence of programmers' personality and problem difficulty on the efficiency of pairs. An agent-based system is used to simulate a pair, and the Myers-Briggs Type Indicator (MBTI) is used to measure the personality of each member of the pairs. This paper presents, suggests, and evaluates the role of personality in the formation and utility of a pair.
Pair programming;Personality;Collaborative programming;Multi-Agent System;Simulation;Myers-Briggs Type Indicator (MBTI)
WoS
null
With the advent of Component-based software engineering (CBSE), large software systems are being built by integrating pre-built software components. The Semantic Web, in association with CBSE, has been shown to offer powerful representation facilities and reasoning techniques to enhance and support the querying, reasoning, discovery, etc. of software components. The goal of this paper is to research the applicability of Semantic Web technologies to the various tasks of CBSE and to review the experimental results of the same in an easy and effective manner. To the best of our knowledge, this is the first study to provide an extensive review of the application of the Semantic Web to CBSE from different perspectives. A systematic literature review of the Semantic Web approaches employed for use in CBSE, reported from 2001 until 2015, is conducted in this research article. Empirical results have been drawn through a question-answer-based analysis of the research, which clearly shows the year-wise trend of the research articles, with possible justifications for the usage of Semantic Web technology and tools for a particular phase of CBSE. To conclude, gaps in the current research and potential future prospects are discussed. (C) 2016 Elsevier Inc. All rights reserved.
Component-based software engineering;Semantic Web;Ontology;Reasoners;Web services;Linked Data
WoS
null
With the advent of large, complex datasets, NoSQL databases have gained immense popularity for their efficiency in handling such datasets in comparison to relational databases. There are a number of NoSQL data stores, e.g., MongoDB, Apache CouchDB, etc. Operations in these data stores are executed quickly. In this paper we aim to get familiar with two of the most popular NoSQL databases: Elasticsearch and Apache CouchDB. This paper also aims to analyze the performance of Elasticsearch and CouchDB on image data sets. This analysis is based on the results of instantiate, read, update, and delete operations on both document-oriented stores, showing that CouchDB is more efficient than Elasticsearch during insert, update, and delete operations, whereas Elasticsearch performs much better than CouchDB during select operations. The implementation has been done on the Linux platform.
NoSQL;Elasticsearch;Apache CouchDB;Performance Analysis
WoS
null
With the advent of modern technology, the use of biometric authentication systems has been on the rise. The core of any biometric system is a database which contains the biometric traits of the successfully enrolled users. As such, maintaining the security of the database is paramount, i.e. it must be ensured that the contents of the database are not compromised by foreign threats or adversaries. Biometric encryption (BE) is by far the most successfully studied and analysed technique for providing this required level of security in biometric systems. In this survey, we discuss the intuition behind this idea and study in depth the key-binding-based mechanisms of BE, which will provide a basic foundation for future novel research in this area. In addition to the latest available survey, our paper investigates in detail the core ideas behind the development of the fuzzy frameworks and includes the most recent works in the available literature. This study is concluded by inspecting the merging of multimodal biometrics with fuzzy systems and discussing some open challenges in this domain.
biometrics (access control);cryptography;fuzzy set theory;database management systems;key-binding-based biometric data protection schemes;biometric authentication systems;database security maintenance;biometric traits;biometric encryption;fuzzy systems
WoS
null
With the advent of new technologies and the various services provided in the context of computer networks, a large volume of data is being generated. The main challenge in this area is providing network protection services against various threats and vulnerabilities. So far, many techniques have been proposed to deal with these threats. All of these techniques pursue the same goal: preventing attackers from reaching their objectives. A solution based on early warning systems (EWSs) is exactly what security teams need to manage threats properly. An EWS, as a complement to an intrusion detection system, is a proactive approach against security threats. This is carried out through the early detection of potentially malicious system behavior, evaluation of the scope of the malicious behavior, and, finally, a suitable response to any kind of detectable security event. This paper presents a comprehensive review of EWSs, including definitions, applications, architectures, alert correlation aspects, and other technical requirements. Furthermore, previous studies and existing EWSs are described and analyzed here. A classification of EWSs is presented: commercial systems, and systems under research and development. Finally, from the studies of EWSs, we conclude that some challenges and research issues still remain open. Copyright (C) 2016 John Wiley & Sons, Ltd.
intrusion detection;intrusion prevention;early warning system;alert correlation;network security
WoS
null
With the advent of Recommendation ITU-R BT.2020 (Rec. 2020) color space for ultra-high definition television (UHDTV), the development of a gamut mapping algorithm for wide to standard gamuts is greatly needed. In wide-gamut UHDTV production, particularly under the assumption of simultaneous live UHDTV and high-definition television (HDTV) broadcast scenarios, real-time gamut mapping from UHDTV to HDTV is vital. Designing such an algorithm requires a careful consideration of the shape and size of significantly different gamuts. In this paper, we set some requirements for gamut mapping from UHDTV to HDTV, and propose a hue-preserved gamut mapping algorithm. The mapping paths are designed in the CIE 1976 L*a*b* (CIELAB) color space through modifications of the hue prediction based on our subjective experimental results on the constant perceived hue. To prevent an excessive chroma reduction, the lightness of the out-of-gamut colors is changed. The design process and mathematical expressions of our proposed gamut mapping algorithm are provided. In addition, we demonstrate the improvements in image quality.
Gamut mapping;high-definition television (HDTV);Rec. ITU-R BT.2020 (Rec. 2020);Rec. ITU-R BT.709 (Rec. 709);ultra-high definition television (UHDTV)
WoS
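The hue-preserving idea in the abstract above can be illustrated with a minimal sketch: express a CIELAB color as lightness, chroma, and hue angle, then pull out-of-gamut colors toward the neutral axis along a constant-hue line. The constant chroma limit `c_max` is an illustrative assumption; the paper's actual mapping paths depend on the Rec. 709 gamut boundary at each hue and lightness, and also adjust lightness to avoid excessive chroma reduction.

```python
import math

def clip_chroma(L, a, b, c_max):
    # Hue-preserving gamut mapping sketch: reduce chroma C* = hypot(a*, b*)
    # while keeping the hue angle h = atan2(b*, a*) fixed.
    c = math.hypot(a, b)
    if c <= c_max:
        return L, a, b
    h = math.atan2(b, a)
    return L, c_max * math.cos(h), c_max * math.sin(h)
```

A color at (L=50, a*=60, b*=80) has chroma 100; clipping it to `c_max=50` halves a* and b* while leaving the hue angle unchanged.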
null
With the advent of direct-acting antivirals (DAAs), or all-oral HCV treatment regimens, there exists a great opportunity to provide HCV treatment to people who inject drugs (PWID) enrolled in an opioid treatment program (OTP). This retrospective study, conducted in the context of routine clinical care, explores the outcomes of HCV treatment with DAAs in PWID enrolled in an OTP. Our study showed that treatment outcomes among our first 75 patients treated with DAAs were nearly equivalent to those of patients in the general population. Ninety-eight percent of patients completing treatment obtained a sustained virologic response, with 10 patients lost to follow-up. Ninety-nine percent of patients adhered to HCV treatment. Ongoing drug use occurred in 23% of patients; however, this did not alter HCV treatment outcomes. Treating HCV infection with DAAs in PWID onsite in an OTP is feasible. (C) 2017 Elsevier Inc. All rights reserved.
Hepatitis C (HCV);People who inject drugs (PWID);Substance use disorder (SUD);Opioid treatment program (OTP);Direct acting antivirals (DAA)
WoS
null
With the aging of the population worldwide, osteoporosis and osteoporotic fractures are becoming a serious health care issue in the Western world. Although less frequent than in women, osteoporosis in men is a relatively common problem. Hip and vertebral fractures are particularly relevant, being associated with significant mortality and disability. Since bone loss and fragility fractures in men have been recognized as serious medical conditions, several randomized controlled trials (RCTs) have been undertaken in males with osteoporosis to investigate the anti-fracture efficacy of the pharmacological agents commonly used to treat postmenopausal osteoporosis. Overall, treatments for osteoporosis in men are less defined than in women, mainly due to the fact that there are fewer RCTs performed in male populations, to the relatively smaller sample sizes, and to the lack of long-term extension studies. However, the key question is whether men are expected to respond differently to osteoporosis therapies than women. The pharmacological properties of bisphosphonates, teriparatide, denosumab, and strontium ranelate make such differentiation unlikely, and available clinical data support their efficacy in men with primary osteoporosis as well as in women. In a series of well-designed RCTs, alendronate, risedronate, zoledronic acid, and teriparatide were demonstrated to reduce the risk of new vertebral fractures in men presenting with primary osteoporosis (including osteoporosis associated with low testosterone levels) and to improve the bone mineral density (BMD). In preliminary studies, ibandronate, denosumab, and strontium ranelate also showed their beneficial effects on surrogate outcomes (BMD and markers of bone turnover) in men with osteoporosis. 
Although direct evidence of their non-vertebral anti-fracture efficacy is lacking, the effects of bisphosphonates, denosumab, teriparatide, and strontium ranelate on surrogate outcomes (BMD and markers of bone turnover) were similar to those reported in pivotal RCTs undertaken in postmenopausal women, in which vertebral and non-vertebral anti-fracture efficacy have been clearly demonstrated. In conclusion, sufficient data exist to support the use of these pharmacological agents in men with primary osteoporosis. Further RCTs are warranted to establish their long-term efficacy and safety.
bisphosphonates;teriparatide;denosumab;strontium ranelate
WoS
null
With the aid of signal flow graphs, we have analyzed the flux-control distribution in linear metabolic pathways with multiple feedback loops and branched pathways. It is shown that the flux control coefficients of the enzymes in a linear pathway with multiple feedback loops can be evaluated by modifying the signal flow graph of the unregulated pathway in a step-by-step fashion. On the basis of the results obtained with the signal flow graphs, a principle of superposition is suggested for calculating the flux control coefficients of a linear pathway with a general pattern of feedback inhibition. Using this superposition principle, it is possible to determine the flux control coefficients directly from an examination of the topology of the feedback loops in the metabolic pathway, without drawing a signal flow graph. In a branched pathway, the control coefficients of the enzymes depend on the fluxes through the various branches in addition to the enzyme elasticities. We show how these fluxes can be incorporated into a signal flow graph from which the flux control coefficients are found. We also describe a systematic procedure for converting a signal flow graph to a simpler form, which may significantly reduce the effort necessary for calculating the flux control coefficients. Modifications of a signal flow graph for assessing the relative importance of the enzymes in flux control are also discussed. Based on our findings from the signal flow graphs, we have presented a heuristic method for determining the flux control coefficients directly from the reaction sequence of the pathway, without drawing a signal flow graph. The present analysis applies to metabolic pathways in a steady state.
METABOLIC REGULATION;THEORETICAL MODEL;GRAPH-THEORETIC ANALYSIS;FLUX CONTROL COEFFICIENT
WoS
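As a small numerical illustration of the flux control coefficients discussed above, consider a two-step linear pathway with linear kinetics. The toy model below is our own (the paper's signal-flow-graph method handles general feedback and branching); it computes each coefficient C_i = (e_i/J) dJ/de_i by finite differences and lets one check the summation theorem, C_1 + C_2 = 1.

```python
def flux(e1, e2, S0=10.0):
    # Steady-state flux of a two-step linear pathway with linear kinetics:
    # v1 = e1*(S0 - s), v2 = e2*s  =>  J = e1*e2*S0/(e1 + e2)
    return e1 * e2 * S0 / (e1 + e2)

def control_coefficient(e1, e2, which, h=1e-6):
    # C_i = (e_i / J) * dJ/de_i, via central finite differences.
    J = flux(e1, e2)
    if which == 1:
        dJ = (flux(e1 + h, e2) - flux(e1 - h, e2)) / (2 * h)
        return e1 / J * dJ
    dJ = (flux(e1, e2 + h) - flux(e1, e2 - h)) / (2 * h)
    return e2 / J * dJ
```

For this model the analytic answer is C_1 = e2/(e1+e2), so with e1=2, e2=3 the first enzyme's coefficient is 0.6 and the two coefficients sum to 1, as the summation theorem requires.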
null
With the aid of symbolic computation by Maple, a new integrable generalization of the classical Wadati-Konno-Ichikawa hierarchy is derived from a corresponding matrix spectral problem associated with the Lie algebra sl(2, R). Each equation in the resulting hierarchy has a bi-Hamiltonian structure furnished by the trace identity, and possesses infinitely many independent commuting symmetries and conservation laws. (C) 2014 Elsevier B.V. All rights reserved.
Soliton hierarchy;Matrix spectral problem;Lie algebra sl(2,R);Hamiltonian structure;Symbolic computation
WoS
null
With the aid of the symbolic computation system Maple, the discrete Ablowitz-Ladik equation is studied via an algebraic method, and some new rational solutions with four arbitrary parameters are constructed. By analyzing the related parameters, discrete rogue wave solutions with alterable positions and amplitudes for the focusing Ablowitz-Ladik equation are derived. Some properties are discussed through graphical analysis, which might be helpful for understanding physical phenomena in optics.
symbolic computation Maple;Ablowitz-Ladik equation;rational solutions;discrete rogue wave solutions
WoS
null
With the aid of theoretical calculations, a series of molecularly imprinted polymers (MIPs) was designed and prepared for the recognition of dicyandiamide (DCD) via precipitation polymerization, using acetonitrile as the solvent at 333 K. On the basis of the long-range-corrected method M062X/6-31G(d,p), we simulated the bonding sites, bonding situations, binding energies, imprinted molar ratios, and the mechanisms of interaction between DCD and the functional monomers. Among acrylamide (AM), N,N'-methylenebisacrylamide (MBA), itaconic acid (IA), and methacrylic acid (MAA), MAA was confirmed to be the best functional monomer, because the strongest interaction (the maximum number of hydrogen bonds and the lowest binding energy) occurs between DCD and MAA when the optimal molar ratios of DCD to the functional monomers are used. Additionally, pentaerythritol triacrylate (PETA) was confirmed to be the best cross-linker among divinylbenzene (DVB), ethylene glycol dimethacrylate (EGDMA), trimethylolpropane trimethylacrylate (TRIM), and PETA. This is due to the fact that the weakest interaction (the highest binding energy) occurs between PETA and DCD, and the strongest interaction (the lowest binding energy) occurs between PETA and MAA. Based on the results of the theoretical calculations, a series of MIPs was prepared. Among them, the ones prepared using DCD, MAA, and PETA as the template, the functional monomer, and the cross-linker, respectively, exhibited the highest adsorption capacity for DCD. The apparent maximum adsorption capacity of DCD on the MIP was 17.45 mg/g.
dicyandiamide;molecular imprinting;molecularly imprinted polymer;computer simulation
WoS
null
With the aim of improving the detection of novel single-nucleotide polymorphisms (SNPs) in genetic association studies, we propose a method of including prior biological information in a Bayesian shrinkage model that jointly estimates SNP effects. We assume that the SNP effects follow a normal distribution centered at zero with variance controlled by a shrinkage hyperparameter. We use biological information to define the amount of shrinkage applied to the SNP effects distribution, so that the effects of SNPs with more biological support are shrunk less toward zero and are thus more likely to be detected. The performance of the method was tested in a simulation study (1,000 datasets, 500 subjects with approximately 200 SNPs in 10 linkage disequilibrium (LD) blocks) using a continuous and a binary outcome. It was further tested in an empirical example on body mass index (continuous) and overweight (binary) in a dataset of 1,829 subjects and 2,614 SNPs from 30 blocks. Biological knowledge was retrieved using the bioinformatics tool Dintor, which queried various databases. The joint Bayesian model with inclusion of prior information outperformed the standard analysis: in the simulation study, the mean ranking of the true LD block was 2.8 for the Bayesian model versus 3.6 for the standard analysis of individual SNPs; in the empirical example, the mean ranking of the six true blocks was 8.5 versus 9.3 in the standard analysis. These results suggest that our method is more powerful than the standard analysis. We expect its performance to improve further as more biological information about SNPs becomes available.
Bayesian model;genetic association studies;prior knowledge;shrinkage
WoS
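The core idea above, less shrinkage for SNPs with more biological support, can be sketched with a simple per-SNP ridge-style estimator. This marginal, non-Bayesian sketch only illustrates the weighting scheme; the paper fits a joint Bayesian model with a shrinkage hyperparameter, and the linear support-to-penalty mapping below is a hypothetical choice of ours.

```python
def shrunken_effects(X, y, support, base_lambda=10.0):
    # Per-SNP ridge-style shrinkage: SNPs with stronger biological support
    # (support[j] in [0, 1]) receive a smaller penalty, hence less shrinkage
    # toward zero.  The linear weighting below is a hypothetical form.
    p = len(X[0])
    betas = []
    for j in range(p):
        xj = [row[j] for row in X]
        xy = sum(a * b for a, b in zip(xj, y))
        xx = sum(a * a for a in xj)
        lam = base_lambda * (1.0 - support[j])
        betas.append(xy / (xx + lam))
    return betas
```

With full support (support=1) the penalty vanishes and the estimate equals the least-squares solution; with no support the estimate is pulled toward zero.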
null
With the Amazon EC2 Cloud becoming available as a viable platform for parallel computing, Earth system modelers are increasingly interested in leveraging its capabilities towards improving climate projections. In particular, faced with long wait periods on high-end clusters, the elasticity of the Cloud presents a unique opportunity of potentially "infinite" availability of small-sized clusters running on high-performance instances. Among specific applications of this new paradigm, we show here how uncertainty quantification in climate projections of polar ice sheets (Antarctica and Greenland) can be significantly accelerated using the Cloud. Indeed, small-sized clusters are very efficient at delivering sensitivity and sampling analysis, core tools of uncertainty quantification. We demonstrate how this approach was used to carry out an extensive analysis of ice-flow projections on one of the largest basins in Greenland, the North-East Greenland Glacier, using the Ice Sheet System Model, the public-domain NASA-funded ice-flow modeling software. We show how errors in the projections were accurately quantified using Monte-Carlo sampling analysis on the EC2 Cloud, and how a judicious mix of high-end parallel computing and Cloud use can best leverage existing infrastructures and significantly accelerate the delivery of potentially ground-breaking climate projections, in particular enabling uncertainty quantification that was previously impossible to achieve. (C) 2016 Elsevier Ltd. All rights reserved.
Polar;Ice sheet;Modeling;Cloud;Uncertainty quantification
WoS
null
With the amount of Internet traffic increasing substantially, measuring per-flow traffic accurately is an important task. Because of the nature of high-speed routers, a measurement algorithm should be fast enough to process every packet going through them, and should be executable with only a limited amount of memory, as well. In this paper, we use two techniques to solve memory/speed constraints: (1) recycling a memory block by resetting it (for memory constraint), and (2) confinement of virtual vectors to one word (for speed constraint). These techniques allow our measurement algorithm, called a recyclable counter with confinement (RCC), to accurately measure all individual flow sizes with a small amount of memory. In terms of encoding speed, it uses about one memory access and one hash computation. Unlike other previously proposed schemes, RCC decodes very quickly, demanding about three memory accesses and two hash calculations. This fast decoding enables real-time detection of a high uploader/downloader. Finally, RCC's data structure includes flow labels for large flows, so it is possible to quickly retrieve a list of large-flow names and sizes.
Computer networks;network traffic measurement;computer network security;system analysis and design
WoS
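RCC's exact data structure (recyclable, word-confined virtual vectors with flow labels) is specific to the paper above. As a generic, hedged illustration of the same design goal, small-memory approximate per-flow counting with very few memory accesses per packet, here is a classic count-min sketch, which is a well-known stand-in rather than the paper's algorithm.

```python
import hashlib

class CountMin:
    # Count-min sketch: approximate per-flow counters in a fixed-size array.
    def __init__(self, width=1024, depth=4):
        self.width, self.depth = width, depth
        self.rows = [[0] * width for _ in range(depth)]

    def _index(self, flow, i):
        # One independent hash per row (seeded by the row index i).
        h = hashlib.sha256(f"{i}:{flow}".encode()).hexdigest()
        return int(h, 16) % self.width

    def add(self, flow, n=1):
        # Encoding: one counter update per row for each packet of the flow.
        for i in range(self.depth):
            self.rows[i][self._index(flow, i)] += n

    def estimate(self, flow):
        # Decoding: the minimum over rows never underestimates the true count.
        return min(self.rows[i][self._index(flow, i)]
                   for i in range(self.depth))
```

After adding a flow five times, its estimate is at least five (exactly five unless hash collisions inflate every row).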
null
With the application of conditional free shipping policies becoming more and more extensive, research on them is growing as well. How to set the free shipping threshold and the shipping cost has become an important problem faced by e-commerce enterprises. We analyze examples from a series of solutions generated by an algorithmic model implemented in C# and reach some conclusions: when the product price is within the consumers' psychological expectation range for the product, setting the free shipping threshold at the boundary of the product price portfolios produces a jump in retailer profit; in addition, the optimal free shipping threshold should be set around the mean of the distribution of consumer expectations. Of course, the optimal delivery-service pricing strategy of an enterprise is also affected by various other factors; its forms are complicated and depend on the specific values of the parameters.
E-commerce;Pricing Strategy;Supplier;Competition;Doctor;Cooperation;Coordination
WoS
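The threshold-setting problem above can be sketched with a toy profit model: consumers whose basket falls close enough below the threshold top it up to qualify for free shipping, and the retailer absorbs the shipping cost on qualifying orders. All parameter values (margin, shipping cost, top-up willingness) are illustrative assumptions, not the paper's model.

```python
def profit(threshold, baskets, margin=0.3, ship_cost=8.0, topup=10.0):
    # Retailer profit for one free-shipping threshold, over a list of
    # intended basket values.  Consumers within `topup` below the threshold
    # add items to reach it; the retailer pays shipping on qualifying orders.
    total = 0.0
    for a in baskets:
        order = a
        if a < threshold <= a + topup:
            order = threshold  # consumer tops up to earn free shipping
        total += margin * order - (ship_cost if order >= threshold else 0.0)
    return total

def best_threshold(candidates, baskets):
    # Brute-force search over candidate thresholds, as in an enumerative
    # solution of the model.
    return max(candidates, key=lambda t: profit(t, baskets))
```

For a single consumer with a 20-unit basket and a 25-unit threshold, the consumer tops up to 25, and profit is 0.3*25 - 8 = -0.5: the extra margin does not yet cover the shipping cost.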
null
With the application of digital control, the problem of one-step delay appears, which limits the achievable control bandwidth. Compared with one-step delay, minimized delay can achieve superior performance. However, it leads to a low-frequency aliasing phenomenon, because sampling does not happen in the middle of either the turn-on or the turn-off time of the IGBT. To weaken the influence of aliasing and reduce the total harmonic distortion (THD) of the output current, a strategy of reducing the inductor current feedback coefficient by adjusting non-dominant poles is proposed. Simulation and experimental results are provided to verify the feasibility and effectiveness of the scheme.
State space;minimizing computational delay;low-frequency aliasing;poles placement
WoS
null
With the application of network technology, the risk to network security is gradually increasing. In order to predict the likelihood of network risks in real time, a Time-Varying Markov Model (TVMM) for real-time risk probability prediction is proposed. The method is able to accurately predict the probability of future network risk using a state probability transition matrix of the TVMM that is updated in real time. The model is used to calculate the risk probability of the network at different risk levels in a network attack environment. The results show that the TVMM has higher real-time objectivity and accuracy than the traditional Markov model.
Safety risk prediction;Time-Varying Markov Model (TVMM);Network attack
WoS
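The real-time-updating transition matrix described above can be sketched as follows: observed state transitions update a count matrix, which is renormalized into a probability transition matrix for one-step-ahead risk-level prediction. The risk-level names and the count-based update scheme are illustrative assumptions.

```python
from collections import defaultdict

def update_counts(counts, transitions):
    # Real-time update: accumulate newly observed state transitions.
    for s, t in transitions:
        counts[s][t] += 1

def transition_matrix(counts, states):
    # Row-normalize the current counts into a probability transition matrix;
    # rows with no observations fall back to a uniform distribution.
    P = {}
    for s in states:
        total = sum(counts[s][t] for t in states)
        P[s] = {t: (counts[s][t] / total if total else 1.0 / len(states))
                for t in states}
    return P

# Illustrative risk levels and observed transitions (hypothetical data).
states = ["low", "medium", "high"]
counts = {s: defaultdict(int) for s in states}
update_counts(counts, [("low", "low"), ("low", "medium"), ("medium", "high"),
                       ("low", "medium"), ("medium", "high")])
P = transition_matrix(counts, states)  # P["low"] predicts the next risk level
```

As new attacks are observed, `update_counts` is called again and the matrix is rebuilt, which is what makes the model time-varying.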
null
With the opening of the Brazilian market to international industrial products, it is necessary to check the quality conformity of these products in order to protect the consumer, encourage continuous quality improvement, and promote competition between manufacturers [1]. Compact fluorescent lamps (CFLs, or LFCs) are one of several products whose quality the National Institute of Metrology, Quality and Technology (INMETRO) regulates. One of the CFL testing stages is to keep the CFLs supplied with electric power for a period of time and, after this period, to switch them off according to a predefined schedule. The CFL supply circuit is built using semiconductors, and these electronic components can cause an undesirable effect (harmonics) in the electrical network. These effects cause a variety of problems in the driving, conduction, and stabilization circuits of the laboratory. This paper presents an analysis of, and suggests a solution to, the problem of energy quality in one of the laboratories accredited by INMETRO.
Harmonics;Energy Quality;Lighting
WoS
null
With the broad use of business process management technology, there are more and more business process models. Since the abilities of different modelers differ, the quality of these models varies. A question that arises here is: can we refactor these models to improve their quality, as is practised in software engineering? Business process modeling can be regarded as declarative programming, and business process models can be used to drive process-aware information systems, which are generally developed with a model-driven architecture; business process models are therefore crucial for the efficiency of process-aware information systems. In this paper, we propose, for the first time, a novel approach for systematically refactoring business process models by substituting parallel structures for sequence structures. More specifically, we analyze the real causal relations between business tasks based on data operation dependency analysis, and refactor business process models with process mining technology. After comprehensive model refactoring, parallel execution of business tasks can be maximized, so the efficiency of business processing can be improved; that is, the quality of business process models can be improved. Analysis and experiments show that our approach is effective and efficient.
Business process;model;refactor;parallel
WoS
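The data-operation dependency test behind such refactoring can be sketched as follows: two sequentially ordered tasks may be placed in a parallel structure only if their data operations do not conflict. The task names and read/write sets below are illustrative assumptions, not the paper's models.

```python
def independent(t1, t2):
    # Tasks conflict when there is a read-after-write, write-after-read,
    # or write-after-write dependency on shared data items.
    r1, w1 = t1
    r2, w2 = t2
    return not (w1 & r2 or r1 & w2 or w1 & w2)

def parallelizable_pairs(tasks):
    # Scan a sequential model (task name -> (reads, writes)) for adjacent
    # pairs that could be refactored into a parallel (AND-split) structure.
    names = list(tasks)
    return [(a, b) for a, b in zip(names, names[1:])
            if independent(tasks[a], tasks[b])]
```

For example, a task writing `y` cannot be parallelized with a successor that reads `y`, but it can be parallelized with one that touches only `x` and `z`.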
null
With the coming of the big data era, large-scale data processing presents a new challenge. String matching still plays an important role in the network security and information retrieval fields, but large pattern sets impose heavy overheads in memory size and memory access time. Improving string matching algorithms to adapt to large-scale tasks is therefore desirable and meaningful. In this paper, we present and implement a parallel multiple string matching algorithm on a multi-core platform. In addition, this work focuses on partitioning the pattern set with a genetic algorithm that exploits the internal relations among the patterns, in order to reduce the memory overhead and improve execution performance. Compared with classical algorithms, our experiments on both high and low hit-rate data demonstrate that the performance of the algorithm improves on average by 20%-40% in general. Besides, the proposed algorithm reduces the memory cost on average by 4%-20%.
Parallel;multiple string matching;multi-core;genetic algorithm
WoS
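The pattern-set partitioning idea above can be sketched with a minimal parallel matcher: the pattern set is split across workers, each of which scans the text for its own share. Here the partition is a simple round-robin split and matching uses naive substring search; the paper instead optimizes the partition with a genetic algorithm and uses proper multi-pattern matching within each core.

```python
from concurrent.futures import ThreadPoolExecutor

def match_subset(text, patterns):
    # Each worker scans the text for its share of the pattern set.
    return {p for p in patterns if p in text}

def parallel_match(text, patterns, workers=4):
    # Round-robin partition of the pattern set across workers; the partition
    # policy is the tunable piece the paper optimizes genetically.
    chunks = [patterns[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as ex:
        results = ex.map(match_subset, [text] * workers, chunks)
        return set().union(*results)
```

The union of the per-worker results equals the result of a sequential scan over the whole pattern set.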
null
With the constant development of the economy in our country in recent years, the role of network technology in contemporary society has become more and more prominent. The network has become an indispensable information tool for modern people's daily life and work. While cloud computing, as a new kind of computer technology, can provide convenient services for people, it also brings hidden dangers to network security. Therefore, strengthening the construction of enterprise network security management systems under cloud computing technology is particularly important. Starting from an overview of cloud computing, this article elaborates the important role cloud computing technology plays in the safe operation of enterprise network security management systems, explores the specific role it plays in the construction of such systems, and puts forward strategies to further perfect enterprise network security management systems under cloud computing technology.
Cloud computing technology;Enterprise;Network security management system
WoS
null
With the continuing growth of renewable penetration in power systems, it becomes increasingly challenging to manage the operational uncertainty at near-real-time stage via deterministic scheduling approaches. This paper explores the necessity, benefits and implementability of applying stochastic programming to security constrained economic dispatch (SCED). We formulate a stochastic look-ahead economic dispatch (LAED-S) model for near-real-time power system operation. A concept of uncertainty responses is introduced to assess the power system economic risk with respect to net load uncertainties. This concept offers the system operator a simple yet effective gauge to decide whether a stochastic approach is more desirable than a deterministic one. For an efficient stochastic dispatch algorithm, an innovative hybrid computing architecture is proposed. It leverages the progressive hedging algorithm and the L-shaped method. Numerical experiments are conducted on a practical 5889-bus system to illustrate the effectiveness of the proposed approach.
L-shaped method;optimal size reduction;parallel computing;power system economic risk;progressive hedging algorithm;renewable integration;stochastic look-ahead dispatch;uncertainty response
WoS
null
With the continuous development of computer and network technology, massive amounts of data have been generated, and the world has moved from the era of data to the era of big data. The arrival of the big data era has had a major impact on computer information processing technology: existing techniques struggle to meet the demands of massive data processing, while the easy availability of massive data also raises data security issues. This article first defines the concepts of computer graphics information processing technology and big data, then analyzes the opportunities and challenges that the big data era poses for computer graphics information processing technology, and finally discusses its future development directions.
Computer Graphics Processing Technology;Innovation;Big Data Era
WoS
null
With the continuous development of higher education and the improvement of scientific research conditions, the quality and quantity of large instruments and equipment in colleges and universities have improved greatly. However, the problem of low utilization of these high-quality resources has not been well solved. This paper identifies problems in the construction, management, and utilization of open sharing systems for large instruments in colleges and proposes corresponding solutions, which can also serve as a reference for other colleges building such systems. The practice of open sharing of large instruments promotes the effective utilization of technological resources in colleges. Advanced technology, intensive operation, humanized service, and networked management will be significant in the construction of better open sharing systems for large instruments. Taking Harbin Institute of Technology as an example, this paper summarizes the construction of its sharing platform for large precision instruments, analyzes the problems encountered, and proposes measures for constructing open sharing systems for large instruments in colleges.
open sharing system;large instruments and equipment;technological resources
WoS
null
As living standards rise, skin care is attracting more and more attention. Many factors affect skin condition, and these factors are nonlinearly correlated; however, current research typically uses a single index to evaluate skin condition, which is not accurate. This article introduces seven key factors affecting skin condition into the evaluation. A skin condition classification model is built with a BP neural network according to the skin condition indices. First, on the basis of age, skin condition is classified into three types: young, middle-aged, and old, and the features of each type are extracted. Then, using the seven key factors, the neural network is applied to learn the skin condition, and finally the network's classification result is compared with the real age to complete the evaluation. The results show that, with the method introduced in this article, the agreement with real age reaches 70%, demonstrating that a BP neural network can evaluate skin condition effectively. In addition, the method is simple and practical, providing an effective way to perform the evaluation. The physical meaning of the evaluation is clear, and it makes maximal use of the difference information, giving an accurate and effective way to obtain skin condition information.
neural network;classification;the status of human skin;evaluation
WoS
null
With the continuous expansion of single cell biology, the observation of the behaviour of individual cells over extended durations and with high accuracy has become a problem of central importance. Surprisingly, even for yeast cells that have relatively regular shapes, no solution has been proposed that reaches the high quality required for long-term experiments for segmentation and tracking (S&T) based on brightfield images. Here, we present CellStar, a tool chain designed to achieve good performance in long-term experiments. The key features are the use of a new variant of parametrized active rays for segmentation, a neighbourhood-preserving criterion for tracking, and the use of an iterative approach that incrementally improves S&T quality. A graphical user interface enables manual corrections of S&T errors and their use for the automated correction of other, related errors and for parameter learning. We created a benchmark dataset with manually analysed images and compared CellStar with six other tools, showing its high performance, notably in long-term tracking. As a community effort, we set up a website, the Yeast Image Toolkit, with the benchmark and the Evaluation Platform to gather this and additional information provided by others.
image analysis;segmentation and tracking;parameter learning;imaging benchmark
WoS
null
With the degree of parallelism increasing, the performance of multi-threaded shared-variable applications is limited not only by serialized critical section execution, but also by the serialized competition overhead for threads to get access to the critical section. As the number of concurrent threads grows, such competition overhead may exceed the time spent in the critical section itself and become the dominating factor limiting the performance of parallel applications. In modern operating systems, the queue spinlock, which comprises a low-overhead spinning phase and a high-overhead sleeping phase, is often used to lock critical sections. In this paper, we show that this advanced locking solution may create very high competition overhead for multithreaded applications executing in NoC-based CMPs. We then propose a software-hardware cooperative mechanism that can opportunistically maximize the chance that a thread wins critical section access in the low-overhead spinning phase, thereby reducing the competition overhead. At the OS primitives level, we monitor the remaining times of retry (RTR) in a thread's spinning phase, which reflects how soon the thread must enter the high-overhead sleep mode. At the hardware level, we integrate the RTR information into the packets of locking requests and let the NoC prioritize locking request packets according to the RTR information. The principle is that the smaller the RTR a locking request packet carries, the higher its priority and thus the quicker its delivery. We evaluate our opportunistic competition overhead reduction technique with cycle-accurate full-system simulations in GEM5 using the PARSEC (11 programs) and SPEC OMP2012 (14 programs) benchmarks.
Compared to the original queue spinlock implementation, experimental results show that our method can effectively increase the opportunity for threads to enter the critical section in the low-overhead spinning phase, reducing the competition overhead by an average of 39.9% (and up to 61.8%) and accelerating the execution of the Region-of-Interest by an average of 14.4% (and up to 24.5%) across all 25 benchmark programs.
Critical Section;CMP;NoC;OS
WoS
null
With the development and popularization of the Internet, computer networks have been widely used in various trades and fields, appearing frequently in daily office work, management, life, and services in government departments, institutions, enterprises, and elsewhere. The application of Internet technology has, however, encountered difficulties in practice, and it is necessary to improve how networks are applied, especially regarding network maintenance and security risks. Research and analysis on network maintenance and security management is therefore of great significance for further guaranteeing network operation. In this paper, the author first describes the concepts of network maintenance and security management, then analyzes the application of network security management technology and the corresponding multi-level protection measures, and finally points out current problems in network maintenance and security management and proposes improvement strategies, providing new ideas for network development and maintenance in the new era.
Network;Maintenance;Security;Protection
WoS
null
With the development of advanced manufacturing engineering, selecting suppliers scientifically helps improve competitive power. An evaluation architecture is presented. A method for determining supplier evaluation indicator weights based on information entropy is then given. Furthermore, a supplier selection model is built. Finally, an example is given, illustrating that the method helps select suppliers that enhance enterprise competence.
supplier selection;entropy;information theory;synthesized evaluation
WoS
null
With the development of advanced manufacturing engineering, selecting suppliers scientifically helps improve competitive power. This paper suggests a methodology leading to effective supplier management processes that utilizes information obtained from the supplier selection process. First, the evaluation indicator architecture is presented. Then, by normalizing the value of each indicator, the processed values can be regarded as the distribution of a random variable, and a method for determining indicator weights based on information entropy is given. Furthermore, a supplier selection model is built. Finally, a simulated example is given, illustrating that the method helps select suppliers that enhance enterprise competence.
supplier selection;entropy;information theory;synthesized evaluation
WoS
null
With the development of agriculture, industry and urbanization, land-use and land-cover (LULC) change has resulted in significant deterioration of water quality and severe eutrophication in most of the lakes in China. The plateau lake ecosystem in China is very vulnerable and especially sensitive to environmental change and human disturbance, due to its strong closedness, species simplification, oligotrophy and simple food chain. This research focuses on evaluating the quantitative and spatial relationships between land use pattern and the water quality of rivers flowing into Fuxian Lake, China's largest deep plateau freshwater lake. To investigate the influence of spatial variation in land-use structures and topography on river water quality, the distributions of land-use types in the lake's drainage basin were obtained from satellite images, and the correlations between land-use types and inflow water quality indicators were examined by applying statistical analysis and the spatial analytical functions of a Geographic Information System. Subarea-level analysis reveals that a land-use type can exert different effects on water quality in plains and mountains, and that these effects are connected with topographic and hydrologic factors, the mixture with other land-use types, weather conditions during field measurements, and scale. In addition, a comparison of correlation coefficient data for buffer regions of different scales indicated that the effect of land-use type on inflow water quality peaked in buffer regions with a radius between 100 and 200 m. On the whole, the regions within 200 m of river banks were the key regions affecting river water quality, and thus the construction and preservation of a riparian buffer zone in these regions can provide considerable protection from the inputs of non-point source pollutants and nutrients, as well as important functions such as water and soil conservation.
Based on these findings, a pollution control zoning scheme was constructed, comprising two key pollution control zones in the north and south, a phosphate rock pollution control zone in the northeast, a water loss and soil erosion control zone in the east, and a tourism pollution control zone in the west. This research also offers valuable insights into how to carry out subarea-level prevention and control of water pollution and regional development, according to the natural environment, land use pattern and characteristics of pollution sources in the different pollution control zones. (C) 2016 Elsevier B.V. All rights reserved.
Water quality;Land-use;Topography;Fuxian Lake
WoS
null
With the development of colloid interface and enzyme technologies, enzyme-containing reversed micellar systems have been receiving much attention in bioseparation and bioconversion. Because of their high efficiency, they have brought new opportunities for the development of molecular biotechnology. Reversed micelles are nano-sized aqueous droplets stabilized by surfactant amphiphiles inside bulk organic solvents. Entrapped enzymes exhibit enhanced activities under these conditions, similar to those in the lipid bilayers of biological membranes. The fundamentals of enzyme-containing reversed micellar systems are described in this paper, with special emphasis on the effects of surfactants of varying concentrations and structures. The latest progress in the application of surfactants in enzyme-containing reversed micelles is reviewed. The introduction of novel functional surfactants into micellar enzymology and their future development are also discussed.
reversed micelle;enzyme;surfactant concentration;surfactant molecular structure;enzyme-containing reversed micellar system
WoS
null
With the development of computer graphics, machine vision and virtual reality technology in recent years, 3D reconstruction of outdoor scenes from image sequences has become a key research direction in computer vision and graphics. During image acquisition, due to limitations of the measurement equipment and environment, a single shot sequence of photos may not capture enough surface information, making it impossible to complete the 3D reconstruction. To solve this problem, this paper adopts a method that fuses point clouds from multiple groups of images. First, color histogram matching is used to obtain the supplementary image sequence. Next, the point cloud of the supplementary image sequence is computed separately. Then, the transform parameters in the overlap area of the different point clouds are computed using an improved iterative closest point algorithm. Finally, registration and fusion of the point cloud data from the different groups of photos is performed. Experiments show that this method can effectively supplement point cloud data for reconstruction.
3D reconstruction;Iterative Closest Point (ICP) algorithm;Color histogram matching;Point cloud registration and fusion
WoS
null
With the development of computer network technology, the relationship between people and networks has become ever closer. Network security problems have gradually entered the public's field of vision, and network intrusion detection has become an important direction in the development of network security technology. On the basis of the original BP neural network, this paper puts forward an improved algorithm and applies it to network intrusion detection. Tests show that the method converges faster than the traditional algorithm and achieves better performance.
Intrusion detection;The BP neural network;Network security
WoS
null
With the development of computer network technology, network security issues are attracting more and more attention. The IPsec protocol was put forward as a viable solution for IP communication security, and many IPsec products have been launched. Therefore, detecting and mining possible vulnerabilities in IPsec-related products is important and necessary. Fuzzing can help find vulnerabilities in IPsec product implementations. Among the currently popular fuzzing tools, Peach is a widely recognized open-source fuzzing framework; however, the current Peach is unable to support the IKE protocol. This paper achieves fuzzing of the IKE protocol by extending the Peach framework.
Peach;Fuzzing;IPsec;IKE;Frankencert
WoS
null
With the development of digital control technology, sampled-data control shows prominent superiority in most practical industries. In the framework of sampled-data control, this paper studies the stabilization problem for a class of switched linear neutral systems while taking asynchronous switching into account. By utilizing the relationship between the sampling period and the dwell time of switched neutral systems, a bound linking the sampling period and the average dwell time is revealed to form a switching condition, under which, together with certain control gain conditions, exponential stability of the closed-loop system is guaranteed. A simple example is given to demonstrate the effectiveness of the proposed method. (C) 2016 Elsevier Ltd. All rights reserved.
Sampled-data control;Switched neutral systems;Asynchronous switching;Stabilization
WoS
null
With the development of digital information technologies, robust watermarking frameworks have become a challenging issue in the area of image processing, owing to their broad applicability and utility in academic and real-world environments. A wide range of solutions for image watermarking frameworks exist, each attempting to realize an efficient and applicable idea; in practice, however, traditional techniques lack sufficient merit to realize accurate applications. Because the main idea of the approach is based on the contourlet representation, state-of-the-art materials are investigated together with an integration of the contourlet representation into the watermarking framework to propose a novel and capable technique. In brief, the proposed robust watermarking framework deals with new embedding and de-embedding processes in the contourlet transform domain to generate the watermarked image and the corresponding extracted logo image with high accuracy. The novelty of the approach lies in this pipeline, which consists of the contourlet representation, the embedding and corresponding de-embedding modules, and performance monitoring that includes analysis of the watermarked image as well as the extracted logo image. A scrambling module works in association with the levels-directions decomposition in the contourlet embedding mechanism, while a decision-maker system selects the appropriate number of sub-bands to be embedded in the presence of a series of simulated attacks. The required performance of the watermarked image is assessed through an integration of the peak signal-to-noise ratio and structural similarity indices.
The bit error rate and the normalized correlation are likewise considered in the extracted logo analysis. The outcomes are then fully analyzed and shown to be competitive with potential techniques across image colour models, including hue or tint in terms of shade, saturation or amount of gray, and brightness via value or luminance, as well as hue, saturation and intensity representations, with the performance of all channels presented. The performance monitoring outcomes indicate the significance of the proposed framework.
contourlet based watermarking framework;levels-directions decomposition;embedding process;de-embedding process;peak signal-to-noise ratio;structural similarity indices;normal correlation;bit error rate
WoS
null
With the development of economic globalization, the cultivation of students in minority secretary majors should follow a compound talent training mode and improve their overall quality. Current secretary major training plans have shortcomings in teaching content and lack cross-cultural communication training programs; the teaching objectives are outdated, the professional competence of secretary-major teachers is insufficient, the teaching methods are too traditional, and the curriculum is unreasonable. In view of this situation, this paper develops a digital DRM solution using a DRM digital control program based on a diversified intelligent teaching reform program for talent training. The paper establishes and develops the ODRL talent model, achieving segmented description and dynamic segmented authorization of the tasks and contents of minority secretary major courses. It also achieves the secondary development of DRM under the ODRL control model and verifies the feasibility and reliability of the training mode using an OpenAPI-based teaching effect evaluation method, providing a reliable theoretical basis for the reform of minority secretary major training models.
DRM control model;minority;secretary major;diverse intelligent;ODRL model;OpenAPI authentication
WoS
null
With the development of high-technology warfare, the bionic autonomous underwater vehicle (AUV) has become a popular research topic in recent years. To achieve better concealment, stealth technology is the most difficult problem that must be solved. This paper briefly introduces the principal methods for keeping an AUV quiet and difficult to detect, and analyzes the advantages and disadvantages of each method. The stealth technology of underwater vehicles refers to means that prevent the enemy's sonar from detecting the vehicle's location, or that shorten the detection distance, mainly covering two aspects: noise control and acoustic antagonism. Finally, development trends of bionic AUV stealth technology are discussed.
Bionic AUV;Stealth Technology;Noise Control;Acoustic Antagonism;Acoustic Absorption Layer
WoS
null
With the development of home area network, residents have the opportunity to schedule their power usage at the home by themselves aiming at reducing electricity expenses. Moreover, as renewable energy sources are deployed in home, an energy management system needs to consider both energy consumption and generation simultaneously to minimize the energy cost. In this paper, a smart home energy management model has been presented that considers both energy consumption and generation simultaneously. The proposed model arranges the household electrical and thermal appliances for operation such that the monetary expense of a customer is minimized based on the time-varying pricing model. In this model, the home gateway receives the electricity price information as well as the resident desired options in order to efficiently schedule the appliances and shave the peak as well. The scheduling approach is tested on a typical home including variety of home appliances, a small wind turbine, PV panel and a battery over a 24-h period. (C) 2017 Elsevier Ltd. All rights reserved.
Energy management;Appliance scheduling;Smart home;Load shifting;Renewable energy
WoS
null
With the development of information and networking technologies, conventional network has been unable to meet the demands of practical applications and network users. A new network paradigm called Software-Defined Networking (SDN) was proposed and got public attention. By decoupling the forwarding and control planes and applying specific protocols, SDN greatly reduces the cost of network management. Moreover, SDN empowers network managers to program their networks with high flexibility. However, there are many network security issues with regard to SDN, which should be solved in order to ensure the final success of SDN. In this paper, we undertake an SDN security survey. We focus on analyzing SDN's security problems and reviewing existing countermeasures. Meanwhile, we identify the future research directions of SDN security.
Software-Defined Networking;Network security;Network intrusion;Denial of Service;Network management
WoS
null
With the development of integrated circuits, fully integrated continuous-time filters have attracted wide attention. An Nth-order universal filter based on the Current Feedback Operational Amplifier (CFOA) is proposed. Compared with available circuits, the proposed circuit uses fewer components (2n CFOAs, n capacitors, and 3n resistors) and realizes the universal filter functions without changing the circuit configuration; the operating frequency of the proposed filter circuit is at least 10 MHz. All capacitors in the proposed circuit are grounded, which is another benefit from an integration point of view. PSPICE simulations have been carried out using 0.18 mu m CMOS technology, and a sensitivity analysis of the proposed Nth-order low-pass filter circuit is completed. (C) 2015 Elsevier GmbH. All rights reserved.
Active circuit;Nth-order universal filter;Current feedback operational amplifier;Voltage-mode
WoS
null
With the development of liquid-cooled integrated circuits (ICs) using silicon microchannels, the study of heat transfer and thermal modeling in liquid-cooled heat sinks has gained interest in the last five years. As a consequence, several methodologies on the thermally-aware design of liquid-cooled 2-D/3-D ICs and multiprocessor system-on-chips (MPSoCs) have appeared in the literature. A key component in such methodologies is a fast and accurate thermal modeling technique that can be easily interfaced with design optimization tools. Conventional fully numerical techniques, such as finite-element methods, do not lend themselves to such easy interfacing with design tools, and their order of complexity is too large for fast simulations. In this context, we present a new semi-analytical representation for heat flow in forced convective cooling inside microchannels, which is continuous in 1-D, i.e., along the direction of the coolant flow. This model is based on the well-known analogy between heat conduction and electrical conduction, and introduces distributed electrical parameters in the dimension considered to be continuous, resulting in a state-space representation of the heat transfer problem. Both steady state and transient semi-analytical models are presented. The proposed semi-analytical model is shown to have a closed-form solution for certain cases that are encountered in practical design problems. The accuracy of the model has been validated against state-of-the-art thermal modeling frameworks [1] (errors << 1%), with 3X speed-up of our proposed modeling framework.
Forced convective cooling;liquid cooling of ICs;thermal modeling
WoS
null
With the development of multifunctional radar and radio frequency (RF) stealth technology, modern radar needs to save as much operating time as possible. During the process of radar target tracking, with interacting multiple models (IMM), this paper proposes an adaptive Markov transition matrix to update the last step for existing radar target tracking algorithms. First, we take interacting multiple models as the main algorithm frame. Then, using gray relation and particle swarm optimization (PSO), multiple-target adaptive sampling interval algorithm is adopted. After the PSO process, we study two methods to update a Markov transition matrix in real time. One is with the ratio of likelihood function, and the other is with the compression ratio of estimation error. Simulations illustrate that our method is effective in reducing operating time for radars.
radar;tracking;Markov matrix
WoS
null
With the development of multimedia computer technology, interactive projection is becoming a common interactive display and entertainment system in people's daily life. Based on practical needs, an interactive projector is designed: Hu moments are used as the gesture feature extraction algorithm, and an SVM classifier identifies the gesture type. Experiments verify the effectiveness and practicality of the system.
Interactive Projection;Gesture Recognition;Hu moment;SVM
WoS
null
With the development of nanotechnology, understanding intermolecular interactions at the single-molecule level by atomic force spectroscopy (AFM) has played an important role in molecular biology and biomedical science. In recent years, some research has suggested that the presence of metal cations is an important regulator of the misfolding and aggregation of the amyloid beta-protein (A beta), which may be an important etiological factor in Alzheimer's disease. However, knowledge of the principles of interaction between A beta and metal cations at the single-molecule level is still poorly understood. In this paper, the amyloid beta-protein (A beta) was fabricated on a substrate of mixed thiol-modified gold nanoparticles using a self-assembled monolayer method, and the adhesion force in the longitudinal direction between metal cations and A beta 42 was investigated by AFM. The role of metal ions in A beta aggregation is discussed from the perspective of single-molecule force. The force results showed that the specific adhesion force F-i and the nonspecific force F-0 between a single A beta-A beta pair in the control experiment were calculated as 42 +/- 3 and 80 pN, respectively. However, F-i between a single A beta-A beta pair in the presence of Cu2+, Zn2+, Ca2+ and Al3+ increased dramatically to 84 +/- 6, 89 +/- 3, 73 +/- 5, and 95 +/- 5 pN, respectively, which indicated that binding between A beta proteins is strengthened in the presence of metal cations. What is more, the imaging results showed that substoichiometric copper cations accelerate the formation of fibrils within 3 days. The combined atomic force spectroscopy and imaging analysis indicates that metal cations play a role in promoting the aggregating behavior of A beta 42.
Nanotechnology;Atomic force microscopy (AFM);Interactions;Metal cations;Amyloid beta-protein;Nanoparticles;Biomedicine
WoS
null
With the development of network technology, computer software is evolving from the two-tier structure model to multi-layer structures. In a multi-layer structure, middleware is the key layer beneath the application layer; it has become a new technology for application software development and, together with databases and operating systems, forms the basis of computer software. As the application environment of computer software grows increasingly complex, designing multi-layered architectures and using software layering and modularization makes the software process clearer and easier to maintain and extend, enhancing the flexibility and adaptability of computer software development. Stratification technology has therefore been widely used in modern computer software development.
Stratification Technology;Computer Software Development;Application Status
WoS
null
With the development of power electronic technology, the control technology of power electronic devices is becoming more and more complicated. The switching power supply is an indispensable component of modern power electronic equipment; its quality and size have a direct impact on the overall performance of the equipment. Digital control technology and the application of FPGAs have become a new research hotspot in the field of power electronics. Digital control can conveniently reduce device aging and the impact of temperature drift on accuracy. This paper studies an FPGA-based digital switching power supply and describes the structure of the system.
Digital Switching Power Supply;FPGA;System Design
WoS
null
With the progress of science and technology and the formation of comprehensive multidisciplinary technologies, the emergence of the new permanent magnet actuator and electronically operated synchronous switching technology has provided a hardware basis for their realization. To meet the high reliability requirements of vacuum circuit breaker operation, in-depth studies are being conducted in our country to gradually popularize permanent magnet mechanisms matched with electronic control systems. In this article, we study the key technologies of digital-signal-processing-based motor theory and motor control. Experimental results show the effectiveness of the method and reflect the characteristics of the digital control system, offering a useful exploration of digital brushless DC motor motion control for engineering applications.
Digital Signal Processing;Motor Control
WoS
null
With the progress of science and technology, electronic information technology has been accepted by the public and network information has become widespread; Internet Plus has become a hot topic, the campus network has become a standard part of daily life and study, and the application of computers and mobile phones is ever more extensive. In today's highly networked environment, network security issues are receiving more and more attention from all circles of society and have become one of the main problems of the campus network. Since the network security problems faced by the campus network are becoming increasingly serious, how to effectively protect network security has become a key issue; setting up firewalls, installing anti-virus software, and strengthening identity authentication and data encryption are effective protection measures. In this paper, the present safety hazards and protective measures of the campus network are analyzed.
campus network;network security;anti-virus software;firewall;network intrusion
WoS
null
With the development of science and technology, the demand of electrical equipment for high-quality power supplies is increasing. Research on and application of analog circuit control for power supplies has been carried out for many years, but it still has many shortcomings. The FPGA offers significant advantages such as high integration, economy, high speed, low power consumption, and ease of development, maintenance, and upgrading. With its speed, integration, and versatility, it stands out from MCUs, DSPs, and other integrated circuits, and has become a hot spot of research and application in the field of power electronics control. This paper briefly introduces the basic theory of digital systems and analyzes the structural model of digital systems. It gives an overview of design methods for digital control systems and completes the selection of devices and the related development environment and tools. The paper proposes an all-digital power control scheme for special surface treatment and completes the hardware and software design and implementation of the controller.
Switching Power Supply;Digital Control;FPGA
WoS
null
With the development of communication technology and intelligent terminals, manual attendance is being replaced by attendance systems based on intelligent terminal and mobile communication technologies. On this basis, building on existing research on the ALOHA anti-collision strategy, we improve the algorithm for mobile positioning attendance. We first present the algorithm design, then give the specific operation of the terminal, and finally carry out experiments and comparative analysis. Simulation results show that the improved ALOHA algorithm outperforms the traditional anti-collision algorithm, ensuring that the system has a shorter delay while effectively improving throughput performance.
ALOHA;Intelligent terminal;Attendance positioning;Anti collision algorithm
WoS
null
With the development of the design complexity in embedded systems, hardware/software (HW/SW) partitioning becomes a challenging optimization problem in HW/SW co-design. A novel HW/SW partitioning method based on position disturbed particle swarm optimization with invasive weed optimization (PDPSO-IWO) is presented in this paper. It is found by biologists that the ground squirrels produce alarm calls which warn their peers to move away when there is potential predatory threat. Here, we present PDPSO algorithm, in each iteration of which the squirrel behavior of escaping from the global worst particle can be simulated to increase population diversity and avoid local optimum. We also present new initialization and reproduction strategies to improve IWO algorithm for searching a better position, with which the global best position can be updated. Then the search accuracy and the solution quality can be enhanced. PDPSO and improved IWO are synthesized into one single PDPSO-IWO algorithm, which can keep both searching diversification and searching intensification. Furthermore, a hybrid NodeRank (HNodeRank) algorithm is proposed to initialize the population of PDPSO-IWO, and the solution quality can be enhanced further. Since the HW/SW communication cost computing is the most time-consuming process for HW/SW partitioning algorithm, we adopt the GPU parallel technique to accelerate the computing. In this way, the runtime of PDPSO-IWO for large-scale HW/SW partitioning problem can be reduced efficiently. Finally, multiple experiments on benchmarks from state-of-the-art publications and large-scale HW/SW partitioning demonstrate that the proposed algorithm can achieve higher performance than other algorithms.
hardware/software partitioning;particle swarm optimization;invasive weed optimization;communication cost;parallel computing
WoS
null
With the development of satellite radio navigation systems, BeiDou-1 user equipment has found wide application, and the capability of that equipment consequently attracts more attention. A practical solution for testing these devices is to design an automatic test system (ATE) aimed at the special test requirements. A brief introduction to the BeiDou-1 Navigation System and the test requirements of its user equipment is given, followed by the ATE design in detail, including the system structure, the implementation of the navigation signal emulator, and signal and data processing. Test results show that the presented ATE can meet the requirement to test and verify the performance of the user equipment well.
Automatic Test System;navigation signal emulator;signal and data process
WoS
null
With the development of trusted networks, the trusted evaluation mechanism of user behavior has become a research hotspot in network security. In order to solve the problems of subjectivity, limitation, and static evaluation in traditional trusted-network user behavior evaluation models, a real-time and dynamic evaluation method for user behavior is needed. In this paper, the authors construct a real-time evaluation mechanism based on double evidence classification of user behavior (DEC-UB). The evaluation mechanism includes the process classification and characteristic classification of user behavior evidence, so that user behavior evidence from any time can be directly involved in the trust evaluation, making the evaluation result more comprehensive and accurate. Simulation experiments evaluated three kinds of user behavior based on DEC-UB and compared them with two other trust evaluation methods for user behavior. The results show that the proposed method can evaluate user behavior comprehensively, accurately, and dynamically in complex network environments, and that the results are more realistic.
Trusted network;user behavior;trusted evaluation;evidence classification;DEC-UB
WoS
null
Guided by the study outcomes and job requirements of postgraduate students, outcome-based education is implemented in the design and application of the curriculum group for the control engineering discipline at Wuhan University of Science and Technology, based on a demand analysis of the full-time professional master's degree. Combined with the demands of society and enterprises, the structure of the curriculum group places more emphasis on the practical ability of students and is driven by the students' final study outcomes. The curriculum group for the control engineering discipline has three blocks: basic curriculums, core curriculums, and expanding curriculums, and educational quality is evaluated by a variety of student outcomes.
Outcome-Based Education;Full-Time Professional Degree;Construction of Curriculum Group;Control Engineering Discipline
WoS
null
With the disturbing increase of children with Autism Spectrum Disorder (ASD) in Malaysia, a lot of effort and many studies are being put toward understanding and managing matters related to ASD. One way is to find means of easing social communication between these children and their caretakers, particularly during intervention. If the caretaker is able to comprehend the children's emotional state of mind prior to therapy, some sort of trust and attachment will be developed. However, regulating emotions is a challenge for these children, and nonverbal communication such as facial expression is difficult for ASD children. Therefore, we proposed the use of walking patterns (i.e. gait) to detect the type of emotions of ASD children. Even though using gait for emotion recognition is common among typical individuals, no such work has been done on children with ASD. Thus, the aim of this paper is to conduct a preliminary review on the possibilities of carrying out gait-based emotion detection among ASD children with regard to the emotional types, gait parameters and methods of gait data acquisition. (c) 2015 The Authors. Published by Elsevier B.V.
Emotion recognition;Autism Spectrum Disorder;Gait analysis
WoS
null
With the dramatic growth of network attacks, a new set of challenges has arisen in the field of electronic security. Undoubtedly, firewalls are core elements in the network security architecture. However, firewalls may include policy anomalies resulting in critical network vulnerabilities. A substantial step towards ensuring network security is resolving packet filter conflicts. Numerous studies have investigated the discovery and analysis of filtering rule anomalies; however, little emphasis has been given to the resolution of these anomalies. Legacy work for correcting anomalies operates on the premise of creating totally disjunctive rules. Unfortunately, such solutions are impractical from an implementation point of view, as they lead to an explosion in the number of firewall rules. In this paper, we present a new approach for performing assisted corrective actions which, in contrast to the state-of-the-art family of radically disjunctive approaches, does not lead to a prohibitive increase of the firewall size. In this sense, we allow relaxation in the correction process by clearly distinguishing between constructive anomalies that can be tolerated and destructive anomalies that should be systematically fixed. This distinction between constructive and destructive anomalies is assisted by the network administrator, which reflects his major role at the heart of the corrective process. To the best of our knowledge, such an assisted approach to the relaxed resolution of packet filter conflicts has not been investigated before. We provide a theoretical analysis demonstrating that our scheme is sound and indeed results in a conflict-free policy. In addition, we have implemented our solution in a user-friendly tool.
Firewall Policy;Filtering Rules;Anomalies Discovery;Anomalies Correction
WoS
null
With the emergence of Big Data, the use of NoSQL (Not only SQL) technology is rising rapidly among internet companies and other enterprises. Benefits include simplicity of design, horizontal scaling and finer control over availability. NoSQL databases are increasingly considered a viable alternative to relational databases, as more organizations recognize that their schema-less data model is a better method for handling the large volumes of structured, semi-structured and unstructured data being captured and processed today. For example, NoSQL databases are often used to collect and store social media data. This paper aims to introduce the concepts behind NoSQL, provides a review of relevant literature, highlights the different NoSQL database types, and provides arguments for and against adopting NoSQL. A small prototype application has been developed to assess the stated NoSQL benefits and illustrate the differences between the SQL and NoSQL approaches. The last section of the paper offers some conclusions and recommendations for further research to expand upon our work.
NoSQL;SQL;databases;structured data;unstructured data;Big Data
WoS
null
With the emergence of the Microsoft Kinect sensor, many developer communities and research groups have found countless uses and have already published a wide variety of papers that utilize the raw depth images for their specific goals. New methods and applications that use the device generally require an appropriately large ensemble of data sets with accompanying ground truth for testing purposes, as well as accurate models that account for the various systematic and stochastic contributors to Kinect errors. Current error models, however, overlook the intermediate infrared (IR) images that directly contribute to noisy depth estimates. We, therefore, propose a high fidelity Kinect IR and depth image predictor and simulator that models the physics of the transmitter/receiver system, unique IR dot pattern, disparity/depth processing technology, and random intensity speckle and IR noise in the detectors. The model accounts for important characteristics of Kinect's stereo triangulation system, including depth shadowing, IR dot splitting, spreading, and occlusions, correlation-based disparity estimation between windows of measured and reference IR images, and subpixel refinement. Results show that the simulator accurately produces axial depth error from imaged flat surfaces with various tilt angles, as well as the bias and standard lateral error of an object's horizontal and vertical edge.
Computer-aided design (CAD);infrared (IR) dot pattern;Microsoft Kinect;simulation;speckle noise;structured-light 3-D scanner
WoS
null
With the enormous genetic plasticity of malaria parasite, the challenges of developing a potential malaria vaccine candidate with highest efficacy still remain. This study has incorporated a bioinformatics-based screening approach to explore potential vaccine candidates in Plasmodium falciparum proteome. A systematic strategy was defined to screen proteins from the Malaria Parasite Metabolic Pathways (MPMP) database, on the basis of surface exposure, non-homology with host proteome, orthology with related Plasmodium species, and MHC class I and II binding promiscuity. The approach reported PF3D7_1428200, a putative metabolite transporter protein, as a novel vaccine candidate. RaptorX server was used to generate the 3D model of the protein and was validated by PROCHECK. Furthermore, the predicted B cell and T cell epitopes with the highest score were subjected to energy minimization by molecular dynamics simulation to examine their stability within a solvent system. Results from this study could facilitate selection of proteins for entry into vaccine production pipeline in future.
Plasmodium falciparum;Proteome;Vaccine;Structure prediction;Molecular dynamics simulation
WoS
null
With the ever growing demand for location-independent access to Autonomous Decentralized Systems (ADS), anomaly detection for industrial Ethernet, which must satisfy the demanding real-time and reliability requirements of industrial applications, has become one of the most pressing subjects in ADS. In this paper, we present an innovative approach to building a traffic model based on a structural time series model for a chemical industry system. A basic structural model that decomposes the time series into four items is established by stationary analysis of the industrial traffic. Parameters in the model are identified by the state space model, which is derived from the training sequence using standard Kalman filter recursions and the EM algorithm. Furthermore, the performance of the state space model is evaluated by the experimental results, which confirm a significant improvement in detection accuracy and the validity of abnormal data localization. (C) 2016 Elsevier B.V. All rights reserved.
Autonomous Decentralized System;Network security;State space model;Time series
WoS
null
With the ever-increasing human dependency on the Internet for performing various activities such as banking, shopping or transferring money, there equally exists a need for safe and secure transactions. This need automatically translates to the requirement of increased network security and better and faster encryption algorithms. This paper addresses the above issue by introducing a novel methodology utilizing the AES method of encryption, and further enhances it with the help of visual cryptography. The complete algorithm was designed and developed using the MATLAB 2011b software. The paper also discusses a hardware approach for deploying the algorithm and elaborates on an area-, speed- and power-efficient design on a Spartan 3E Xilinx FPGA platform.
AES algorithm;decryption;encryption;FPGA;LUT approach;mix column;splitting method;Verilog;visual cryptography;watermarking
WoS
null
With the ever-growing geriatric population, research on brain diseases such as dementia is more imperative now than ever. The most prevalent of all dementias is Alzheimer's disease, a progressive neurodegenerative disease that presents with deficits in memory, cognition, motor skills, and a general decline in the quality of life. The social and economic burden associated with Alzheimer's disease is tremendous and is projected to grow even greater over the coming years. There is a specific need to elucidate and improve the treatments available, not only to alleviate the symptoms related to dementias such as Alzheimer's but also to prevent the formation of the disease. This is an effort that can be expedited and made more efficient by utilizing an animal model such as the zebrafish. This paper reviews the utility of zebrafish in Alzheimer's research by examining research on a sampling of the treatments available for the disease, specifically donepezil, memantine, and methylene blue. The human model and the shortcomings of the rodent model are also discussed. (C) 2017 Wolters Kluwer Health, Inc. All rights reserved.
Alzheimer's disease;dementia;pharmacology;zebrafish
WoS
null
With the ever-growing prevalence of dementia, nursing costs are increasing, while the ability to live independently vanishes. Dem@Home is an ambient assisted living framework to support independent living while receiving intelligent clinical care. Dem@Home integrates a variety of ambient and wearable sensors together with sophisticated, interdisciplinary methods of image and semantic analysis. Semantic Web technologies, such as OWL 2, are extensively employed to represent sensor observations and application domain specifics as well as to implement hybrid activity recognition and problem detection. Complete with tailored user interfaces, clinicians are provided with accurate monitoring of multiple life aspects, such as physical activity, sleep, complex daily tasks and clinical problems, leading to adaptive non-pharmaceutical interventions. The method has been already validated for both recognition performance and improvement on a clinical level, in four home pilots.
Ambient assisted living;Sensors;Semantic web;Ontologies;Reasoning;Context-awareness;Dementia
WoS
null
With the ever-increasing usage of the internet, the availability of digital data is in tremendous demand. In this context, it is essential to protect the ownership of the data and to be able to find the guilty user. In this paper, a fingerprinting scheme is proposed to provide protection for Numeric Relational Databases (RDB), which focuses on challenges such as: 1. minimum distortion in the numeric database, 2. usability preservation, 3. non-violation of the requirement of blind decoding. When the digital data in question is numeric in nature, the usability of the data needs to be keenly preserved; this is made possible by achieving minimum distortion.
Minimum Distortion;fingerprinting;Tardos code;watermarking;usability constraints
WoS
null
With the evolution of research on network moving target defense (MTD), the selection of the optimal strategy has become one of the key problems in current research. To address the problem of improper defensive strategy selection caused by inaccurately characterizing the attack and defense game in MTD, optimal strategy selection for MTD based on a Markov game (MG) is proposed to balance the hopping defense revenue and network service quality. On the one hand, the traditional matrix game structure often fails to describe the MTD confrontation accurately. To deal with this inaccuracy, MTD based on MG is constructed. A Markov decision process is used to characterize the transition among multiple network states, and a dynamic game is used to characterize the multiple phases of attack and defense in MTD circumstances. Besides, it converts all the attack and defense actions into changes in the attack surface or the exploration surface, thus improving the universality of the proposed model. On the other hand, traditional models care little about defense cost in the process of optimal strategy selection. After comprehensively analyzing the impact of defense cost and defense benefit on strategy selection, an optimal strategy selection algorithm is designed to prevent the deviation of the selected strategies from actual network conditions, thus ensuring the correctness of optimal strategy selection. Finally, the simulation and the deduction of the proposed approach are given in a case study so as to demonstrate the feasibility and effectiveness of the proposed optimal strategy selection approach.
Moving target defense;Markov game;optimal strategy selection;attack surface;exploration surface
WoS
null
With the expansion of wireless network technologies and the emergence of novel mobile applications, 3rd Generation (3G) communication is moving to 4th Generation (4G) communication. Compared to preceding versions, one elementary difference is that 4G wireless networks will operate entirely on TCP/IP, which causes greater risks in terms of safety and reliability. In this paper, a new high-security architecture for a Long Term Evolution (LTE) core network server is proposed and designed. The security architecture contains an asynchronous array of simple processors and two physically isolated high-speed system buses, which ensure that only one bus can be connected to the array of processors at any given time. Experimental results show that the security architecture can effectively prevent external threats from accessing network resources.
4G communication;network security;long term evolution;security computer architecture
WoS
null
With the explosion of 3D character animation across contemporary screen media, more people, disciplines and technologies are now engaging with its production. Explicit representations of computer animation processes help facilitate engagement at a high level, however fail to convey the depth of specialised creative techniques, technical processes and discipline language that is prevalent during the act of animating. This paper introduces the 'Mk I production model', a conceptual process which through its novel use of the software engineering methodology 'agent-oriented modelling', conveys such specialised attributes within an explicit process for producing 3D character animation. To gather insights into how this model is used and perceived by animators within a production environment, it was entrenched within a large undergraduate student animation project named 'Gunter's Fables', where it was positioned as the principal device to inform animators of the production process and their expected activity. The project management team also used the model in weekly peer review sessions as a basis to evaluate animation, and to convey progress and achievement with a colour rating scale. Upon completion of the production phase, the project's 12 student animators successfully delivered 41 short, 10-15 second 3D character animation scenarios that were deemed to be of a consistent and fit for purpose quality. Findings from regular 'sweatbox' review sessions and questionnaires suggest that further investigation and iterative development of the model may improve user engagement with the process. However, the model's demonstrated ability to inform a depth of production process supports the notion that this novel production concept presents a way forward in the communication and production of 3D character animation, and allied animation activities.
3D animation;agent-oriented modelling;production process
WoS
null
With the explosive growth of digital data communications in synergistic operating networks and cloud computing service, hyperconnected manufacturing collaboration systems face the challenges of extracting, processing, and analyzing data from multiple distributed web sources. Although semantic web technologies provide the solution to web data interoperability by storing the semantic web standard in relational databases for processing and analyzing of web-accessible heterogeneous digital data, web data storage and retrieval via the predefined schema of relational / SQL databases has become increasingly inefficient with the advent of big data. In response to this problem, the Hadoop Ecosystem System is being adopted to reduce the complexity of moving data to and from the big data cloud platform. This paper proposes a novel approach in a set of the Hadoop tools for information integration and interoperability across hyperconnected manufacturing collaboration systems. In the Hadoop approach, data is "Extracted" from the web sources, "Loaded" into a set of the NoSQL Hadoop Database (HBase) tables, and then "Transformed" and integrated into the desired format model with Hive's schema-on-read. A case study was conducted to illustrate that the Hadoop Extract-Load-Transform (ELT) approach for the syntax and semantics web data integration could be adopted across the global smartphone value chain. (C) 2016 Published by Elsevier B.V.
Hyperconnected Manufacturing Collaboration System;Hadoop Ecosystem System;Hadoop Database (HBase);Hadoop Hive;Extract-Load-Transform (ELT)
WoS
null
With the explosive growth of mobile terminal access to the network and the shortage of IPv4 addresses, Network Address Translation (NAT) technology has become more and more widely used. The technology not only provides users with convenient access to the Internet, but also brings trouble to network operators and regulatory authorities. Our system performs NAT detection using NetFlow data, which is often used for monitoring and forensic analysis in large networks. In this paper, in order to detect NAT devices, an Out-in Activity Degree method based on network behavior is proposed. Our approach works completely passively and is based on NetFlow data only. It achieves an accuracy of 91.2% in a real large-scale network over a long period of time.
Network address translation;Network security;Netflow;Out-in activity degree
WoS
null
With the fast development of high voltage DC (HVDC) cable, cable insulation under DC conditions has got more attention. In this paper, tests were conducted to study the electrical tree initiation in silicone rubber (SIR) under DC and polarity reversal voltages. It is found that electrical tree initiation has significant polarity effects under both DC and polarity reversal voltages. There are only single-branch-like trees and branch-like trees under DC and polarity reversal voltages. As for pre-stressing effects under polarity reversal, the pre-stressing voltage has positive effects to electrical tree initiation, while the pre-stressing time has little influence. Space charge distribution of SIR under high electric field was studied with flat plate pulsed electro-acoustic (PEA) system, and their characteristics were discussed to explain this phenomenon. Moreover, different treeing breakdown phenomenon is found under polarity reversal voltage which differs from that under DC voltage. The existence of fast charges and slow charges gives reasonable explanation to it. Special attention should be paid to the transient situation like polarity reversal which would result in irreversible effects more easily, and affect true length of life for HVDC cables. (C) 2017 Elsevier B.V. All rights reserved.
Silicone rubber;Electrical tree;HVDC;Polarity reversal voltage;Space charge;Treeing breakdown
WoS
null
With the fast development of solar energy harvesting technology, various portable harvesters with excellent energy capture properties are highly desired, especially those devices used for facades and roofs of buildings. This study proposed a new energy absorption model by integrating wave-guiding polymer materials and fluorescent substance coated textiles, which aimed to address the issues of current energy shortage and excessive pollutant emissions in the ambient environment. Besides, this textile-based production was multifunctional in that it possessed the compelling features of solar radiation collection, thermal insulation and surface decoration. Compared with the conventional solar energy harvester, the as-fabricated device exhibited the merits of light weight, better mechanical flexibility and wider extended applicability. This study has achieved some progress in the area of solar energy harvesting and could have a possible informative effect on the future related research.
flexible absorption device;fluorescent dyestuff;solar energy harvesting;thermodynamics;waveguiding polymer material
WoS
null
With the fast global adoption of the Materials Genome Initiative (MGI), scientists and engineers are faced with the need to conduct sophisticated data analytics on large datasets to extract knowledge that can be used in modeling the behavior of materials. This raises a new problem for materials scientists: how to create and foster interoperability and share developed software tools and generated datasets. A microstructure-informed cloud-based platform (MiCloud (TM)) has been developed that addresses this need, enabling users to easily access and insert microstructure informatics into computational tools that predict performance of engineering products by accounting for microstructural dependencies on manufacturing provenance. The platform extracts information from microstructure data by employing algorithms including signal processing, machine learning, pattern recognition, computer vision, predictive analytics, uncertainty quantification, and data visualization. The interoperability capabilities of MiCloud and its various web-based applications are demonstrated in this case study by analyzing Ti6AlV4 microstructure data via automatic identification of various features of interest and quantifying its characteristics that are used in extracting correlations and causations for the associated mechanical behavior (e.g., yield strength, cold-dwell debit, etc.). The data were recorded by two methods: (1) backscattered electron (BSE) imaging for extracting spatial and morphological information about alpha and beta phases and (2) electron backscatter diffraction (EBSD) for extracting spatial, crystallographic, and morphological information about microtextured regions (MTRs) of the alpha phase. Extracting reliable knowledge from generated information requires data analytics of a large amount of multiscale microstructure data which necessitates the development of efficient algorithms (and the associated software tools) for data recording, analysis, and visualization. The interoperability of these tools and superior effectiveness of the cloud computing approach are validated by featuring several examples of its use in alpha/beta titanium alloys and Ni-based superalloys, reflecting the anticipated computational cost and time savings via the use of web-based applications in implementations of microstructure-informed integrated computational materials engineering (ICME).
Microtextured regions;Macrozones;Titanium;Cloud computing;ICME;MGI;Material informatics;Higher-order statistics
WoS
null
With the fast proliferation of microgrids integrated into the power grid, several nearby microgrids with common benefits have been coupled into multi-microgrids (MMGs), which is a significant stage in developing the smart grid. A suitable control structure and corresponding control device need to be proposed for MMGs to operate efficiently. However, most microgrid control devices are still restricted to a single microgrid and have limited application without a common design. The authors develop control devices with prodigious popularisation value for MMGs based on a hierarchical control structure. According to the proposed four-layer control architecture (scheduling layer, central control layer, integrated terminal layer and bottom layer), the hardware and software design suitable for MMG control devices and its multi-time scale communication architecture are presented. Furthermore, the appearance design and interface exploitation are introduced as well. Experimental results are provided to demonstrate the validity of the proposed MMG control structure and devices, which could contribute to the large-scale application of MMG control devices and provide a reference for other MMG projects.
distributed power generation;power generation control;hierarchical systems;smart power grids;control engineering computing;user interface management systems;large-scale systems;control device development;multimicrogrids;power grid;smart grid development;microgrid control devices;hierarchical control structure;four-layer control architecture;scheduling layer;central control layer;integrated terminal layer;bottom layer;hardware design;software design;multitime scale communication architecture;MMG control device;large-scale application;appearance design;interface exploitation
WoS
null
With fast-growing populations and development, the energy crisis has raised critical requirements for the utilization of clean solar energy. Effective methods to capture and convert solar energy include photocatalytic degradation, hydrogen production, dye-sensitized solar cells, and photocatalytic selective reactions of organic compounds. As key factors in the utilization of solar energy, many photocatalysts have been exploited and great achievements have been made. Ceria (CeO2), which possesses unique 4f electrons, has attracted much interest due to its special electronic and optical structures and outstanding physical and chemical properties. This review provides an overview of recent progress on the photocatalytic performance of CeO2 and CeO2-based materials for photodegradation, solar hydrogen production and photo-selective reactions. We also discuss the factors that affect the photocatalytic performance of these photocatalysts, including the morphologies, structure and composition of the CeO2 and CeO2-based nanomaterials. Moreover, the current challenges and future opportunities of the CeO2 and CeO2-based materials for the capture and conversion of solar energy are also discussed.
CeO2;Photocatalysis;Water splitting;CO2 reduction;Selective reaction
WoS
null
The flood of publicly available data allows scientists to explore and discover new findings. Gene expression is one type of biological data that captures the activity inside the cell; studying gene expression data may reveal the mechanisms of disease development. However, with limited computing resources or knowledge of computer programming, many research groups are unable to utilize the data effectively. For about a decade now, various web-based tools have been developed to analyze gene expression data. Different tools implement different analytical approaches, often producing different outcomes. This study compares three existing web-based gene expression analysis tools, namely Gene-set Activity Toolbox (GAT), NetworkAnalyst and GEO2R, using six publicly available cancer data sets. Results of our case study show that NetworkAnalyst has the best performance, followed by GAT and GEO2R, respectively.
Gene set activity;Gene expression;Disease classification;Cross-dataset validation;Web-based microarray analysis
WoS
null
With fully mature and widely available virtualization solutions, there is growing opportunity for remote and/or virtual laboratories to either supplement or fully replace physical networking laboratories. Our approach focuses on the nodes rather than the environments hosting the nodes. The paper addresses setting up virtual laboratories made of widely available, general-purpose, Linux-based operating systems that act as network operating systems. We discuss several Software Defined Networking solutions and lay out the configuration setup for virtual laboratories. We evaluate them by the learning opportunities and experience they can provide. We conclude with the observation that, with an increasing number of Linux-based network operating systems, management of network forwarding devices becomes management of servers, which leads to the unification of the cloud fabric.
remote laboratory;virtual classroom;education in computer networking;Linux-based network operating systems
WoS
null
With the further implementation of quality education, higher vocational education occupies an increasingly important position in the national education system and takes on the task of cultivating high-quality talent with knowledge of national construction, management, and production for socialism. Ideological and political education is an integral part of education in colleges and universities, and it is equally important for higher vocational education. In recent years, higher vocational education has yielded a series of results in ideological and political education, but it has also encountered many problems. Against this background, this paper discusses the development path of ideological and political education for vocational college students from the perspective of collaborative innovation.
Higher vocational;Ideological and political education;Collaborative innovation;Development;Path
WoS
null
With the generalized bilinear operators based on a prime number p = 3, a Hirota-Satsuma-like equation is proposed. Rational solutions are generated and graphically described by using symbolic computation software Maple. (C) 2016 Elsevier Ltd. All rights reserved.
Generalized bilinear operator;Hirota-Satsuma-like equation;Rational solution
WoS
null
With global environmental pollution and fossil energy shortages becoming increasingly serious, renewable energy sources (RES) are drawing more and more attention. In China, RES are experiencing rapid development. However, because of the randomness of RES and the volatility of their power output, energy storage technology is needed for peak shaving and valley filling, promoting RES utilization and economic performance. Energy storage is thus a precondition for the large-scale integration and consumption of RES. However, China's energy storage industry is at the exploration stage and far from commercialization, which restricts the development of RES to a certain extent. For this reason, this paper concentrates on China's energy storage industry. First, it summarizes the developing status of the energy storage industry in China. Then, it analyzes the existing problems of China's energy storage industry in terms of technical costs, standard systems, benefit evaluation and related policies. Finally, solutions are proposed for these problems to promote the sound development of China's energy storage industry.
China's energy storage industry;Energy storage technologies;Energy storage policies
WoS
null
With the globalization of economic development, probability and statistics are applied throughout modern engineering and social and economic research, forming a core application of mathematics. Mathematical modeling provides a new way of thinking about economic problems; this paper applies mathematical models of probability and statistics to economic decision-making issues in the economic field.
Probability Statistics Mathematical Model;Investment;Application
WoS
null
With the great success of second-generation wireless telephone technology and third-generation mobile telecommunications technology, and the fast development of fourth-generation mobile telecommunications technology, the era of fifth-generation mobile networks or fifth-generation wireless systems (5G) is coming. In this article, we identify the open research issues of 5G security and trust in the context of virtualized networking and software-defined networking. We further propose a framework of security and trust focused on solving 5G network security issues. The proposed framework applies adaptive trust evaluation and management technologies and sustainable trusted computing technologies to ensure computing platform trust and achieve software-defined network security. It adopts cloud computing to securely deploy various trustworthy security services over the virtualized networks. Our analysis shows that the framework can support and satisfy all security requirements specified in standardization. We also suggest future research directions based on the proposed framework and discuss the advantages of our framework in terms of practical deployment. Copyright (C) 2015 John Wiley & Sons, Ltd.
5G security;network function virtualization;software-defined networking;trusted computing;trust management;cloud computing
WoS
null
With green building research on the rise in recent years, the rational relationship between human beings and nature embodied in traditional Chinese architecture, which contains rich experience in ecological design, should be re-examined. Green design strategies for neo-vernacular architecture are also becoming increasingly important and deserve serious attention. This article demonstrates an integrated ecosystem approach in the design process of a neo-vernacular building called Xijie, in the hope of providing a reference and help for similar studies. (C) 2016 The Authors. Published by Elsevier Ltd.
neo-vernacular architecture;architectural design;green strategies;computer simulation
WoS
null
With growing concern about the safety, durability and serviceability of large and important bridges, research and development of bridge health monitoring systems has advanced. To present real-time monitoring and safety analysis results through three-dimensional visualization, this paper studies a three-dimensional visual model for a real-time bridge safety monitoring system. According to the functional requirements, the system is divided into three parts: a 3D graphics engine, a 2D graphics engine, and a database interface. Following the ideas of modern software engineering, a complete system architecture is designed so that these modules work together with high cohesion and low coupling. Developed on the MFC platform using design patterns as a skeleton, the system integrates 3D modeling and rendering technology with GDI+ and SQL database technology, representing great progress in combining computer software technology with monitoring systems. The successful development and trial of the system greatly improve the efficiency of bridge monitoring, help predict hidden safety risks, and ensure the safe operation of bridges, playing an important role in promoting the development of bridge monitoring technology. (C) 2017 Published by Elsevier Ltd.
Three dimensional visualization;bridge health;remote monitoring;system design;database technology
WoS
null
With the growing interest in computational models of visual attention, saliency prediction has become an important research topic in computer vision. Over the past years, many successful saliency models have been proposed, especially for image saliency prediction. However, these models generally do not consider the dynamic nature of scenes and hence work better on static images. To date, there has been relatively little work on dynamic saliency, which deals with predicting where humans look in videos. In addition, previous studies showed that how feature integration is carried out is crucial for accurate results. Yet many dynamic saliency models follow a similar simple design, extracting separate spatial and temporal saliency maps that are then integrated to obtain the final saliency map. In this paper, we present a comparative study of feature integration strategies in dynamic saliency estimation. We employ a number of low- and high-level visual features such as static saliency, motion, faces, humans and text, some of which have not previously been used in dynamic saliency estimation. To explore the strength of feature integration strategies, we investigate four learning-based (SVM, Gradient Boosting, NNLS, Random Forest) and two transformation-based (Mean, Max) fusion methods, resulting in six new dynamic saliency models. Our experimental analysis on two dynamic saliency benchmark datasets reveals that our models achieve better performance than the individual features. In addition, our learning-based models outperform the state-of-the-art dynamic saliency models.
Dynamic saliency;Feature integration;Learning visual saliency
WoS
null
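The transformation-based fusion strategies compared in the study above (Mean, Max) amount to a per-pixel combination of normalized feature maps. A minimal NumPy sketch, with hypothetical feature maps standing in for the paper's static-saliency and motion channels:

```python
import numpy as np

def fuse(maps, strategy="mean"):
    """Transformation-based fusion of per-feature saliency maps.

    `maps` is a list of HxW arrays; each map is min-max normalized
    before the per-pixel mean or max across features is taken.
    """
    stack = np.stack([(m - m.min()) / (np.ptp(m) + 1e-8) for m in maps])
    if strategy == "mean":
        return stack.mean(axis=0)
    if strategy == "max":
        return stack.max(axis=0)
    raise ValueError(f"unknown strategy: {strategy}")

# Two hypothetical feature maps (e.g. static saliency and motion):
static = np.array([[0.0, 1.0], [2.0, 3.0]])
motion = np.array([[3.0, 2.0], [1.0, 0.0]])
fused = fuse([static, motion], strategy="max")
```

The learning-based fusions in the paper (SVM, Gradient Boosting, NNLS, Random Forest) replace this fixed per-pixel rule with weights learned from eye-tracking data; the sketch shows only the transformation-based baseline.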
With the growing use of renewable energy sources, Distributed Generation (DG) systems are rapidly spreading. Embedding DG in the distribution network may be costly due to the grid reinforcements and control adjustments required to maintain electrical network reliability. Deterministic load flow calculations are usually employed to assess the allowed DG penetration in a distribution network in order to ensure that current or voltage limits are not exceeded. However, these calculations may overlook the risk of limit violations due to uncertainties in the operating conditions of the networks. To overcome this limitation, related to both injection and demand profiles, the present paper addresses the problem of DG penetration with a Monte Carlo technique that accounts for the intrinsic variability of electric power consumption. The power absorbed by each load of a medium voltage network is characterized by a load variation curve; a probabilistic load flow is then used to compute the maximum DG power that can be connected to each bus without causing a violation of electric constraints. A distribution network is studied and a comparison is provided between the results of the deterministic and probabilistic load flow analyses. (C) 2014 Elsevier Ltd. All rights reserved.
Distributed generation;Electrical distribution network;Probabilistic load flow;Monte Carlo simulation
WoS
null
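The Monte Carlo idea behind such a probabilistic load flow can be illustrated on a deliberately simplified single-feeder model. All numbers and the linearized voltage expression below are illustrative assumptions, not values or the network model from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-feeder model (illustrative values only):
V0 = 1.0          # slack-bus voltage [p.u.]
R = 0.05          # feeder resistance [p.u.]
V_MAX = 1.05      # upper voltage limit [p.u.]
N = 10_000        # Monte Carlo trials

def violation_prob(p_dg, n=N):
    """Probability that DG injection p_dg causes an overvoltage,
    with demand sampled from an assumed load-variation distribution."""
    p_load = rng.normal(0.8, 0.2, n).clip(min=0.0)   # stochastic demand [p.u.]
    v = V0 + R * (p_dg - p_load) / V0                # linearized voltage rise
    return np.mean(v > V_MAX)

# Largest DG power whose violation probability stays below 5 %
candidates = np.linspace(0.0, 3.0, 61)
p_max = max(p for p in candidates if violation_prob(p) <= 0.05)
print(f"max admissible DG power ~= {p_max:.2f} p.u.")
```

A real probabilistic load flow would replace the one-line voltage approximation with a full AC power flow solution per sample and repeat the search bus by bus, but the sampling-then-constraint-checking structure is the same.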
With the help of new design tools, manufacturing-integrated solutions can be generated that concurrently consider function and process. Based on the design pattern matrix, solution elements can be developed that realize the product function by systematically utilizing manufacturing-induced properties. The developed manufacturing-integrated product solutions are refined using computer-aided methods (feature-based modeling and information modeling). A product embodiment is generated that is specifically tailored to the chosen manufacturing technology. An integrated information model allows the introduced tools to be used throughout the entire development process. The example of a linear flow split snap-fit fastening illustrates how the tools beneficially interact and realize manufacturing potential, resulting in an innovative product design.
Manufacturing-induced properties;design pattern matrix;feature-based modeling;computer-aided design;information modeling
WoS
null
With the help of symbolic computation, the Benjamin-Bona-Mahony (BBM) equation with variable coefficients is presented; the BBM equation was first proposed by Benjamin as the regularized long-wave equation and was originally derived as an approximation for surface water waves in a uniform channel. By employing the improved (G'/G)-expansion method and the truncated Painleve expansion method, we derive a new auto-Backlund transformation, hyperbolic solutions, a variety of traveling wave solutions, soliton-type solutions and two solitary wave solutions of the BBM equation. These solutions possess abundant structures. Figures corresponding to these solutions illustrate the particular localized excitations and the interactions between two solitary waves.
improved (G'/G)-expansion method;truncated Painleve expansion method;symbolic computation;hyperbolic solutions;auto-Backlund transformation
WoS
null
With the help of the Boussinesq perturbation expansion, a new basic equation describing long, small-amplitude, unidirectional wave motion in shallow water with surface tension is derived to fourth order, namely a higher-order Korteweg-de Vries (KdV) equation. The derivation assumes a relation between the small parameter measuring the ratio of wave amplitude to undisturbed fluid depth and the small parameter measuring the square of the ratio of fluid depth to wavelength, parameterized by a small, dimensionless parameter of the order of the amplitude of the motion. Hirota's bilinear method is used to investigate one- and two-soliton solutions of this new higher-order KdV equation.
Higher-order KdV equation;Soliton solutions;Water wave problem;Hirota's bilinear method
WoS
null
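While the fourth-order equation itself is specific to the paper above, the Hirota bilinear machinery it applies can be demonstrated on the classical third-order KdV equation u_t + 6uu_x + u_xxx = 0. The SymPy sketch below checks that the Hirota-form one-soliton u = 2 (log f)_xx, with tau-function f = 1 + exp(2kx - 8k^3 t), satisfies that classical equation (not the paper's higher-order one):

```python
import sympy as sp

x, t, k = sp.symbols('x t k', positive=True)

# Hirota one-soliton of the classical KdV  u_t + 6*u*u_x + u_xxx = 0:
# u = 2 * (log f)_xx with the tau-function f = 1 + exp(eta)
f = 1 + sp.exp(2*k*x - 8*k**3*t)
u = 2*sp.diff(sp.log(f), x, 2)

residual = sp.diff(u, t) + 6*u*sp.diff(u, x) + sp.diff(u, x, 3)
assert sp.simplify(residual) == 0  # the soliton satisfies KdV exactly
```

Two-soliton solutions follow the same pattern with f = 1 + e^η1 + e^η2 + A12 e^(η1+η2), which is what makes the bilinear form convenient for the interaction analysis the abstract describes.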
With the help of the symbolic computation system Maple, the Riccati equation (xi' = a0 + a1 xi + a2 xi^2) expansion method and a variable separation method, some complex wave solutions with q = C1 x + C2 y + C3 t + R(x, y, t) of the (2+1)-dimensional Korteweg-de Vries system are derived. Based on the derived solitary wave solutions, some novel complex wave localized excitations such as complex wave fusion and complex wave annihilation are investigated.
Riccati equation expansion method;Korteweg-de Vries system;complex wave solutions;localized excitations
WoS
null
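The building block of the Riccati expansion method used above is that, for constant coefficients, the Riccati equation xi' = a0 + a1 xi + a2 xi^2 admits tanh-type solutions xi = -a1/(2 a2) - (Delta/(2 a2)) tanh(Delta x / 2) with Delta = sqrt(a1^2 - 4 a0 a2). A short SymPy check with sample coefficients (chosen here purely for illustration):

```python
import sympy as sp

x = sp.symbols('x')
a0, a1, a2 = 1, 0, -1                  # sample constant coefficients
Delta = sp.sqrt(a1**2 - 4*a0*a2)       # discriminant, = 2 for these values

# tanh-type solution of the Riccati equation xi' = a0 + a1*xi + a2*xi**2
xi = -sp.Rational(a1, 2*a2) - Delta/(2*a2)*sp.tanh(Delta*x/2)

residual = sp.diff(xi, x) - (a0 + a1*xi + a2*xi**2)
assert sp.simplify(residual) == 0      # xi = tanh(x) solves xi' = 1 - xi**2
```

In the expansion method, the field is written as a finite power series in xi, and substituting into the target PDE turns these tanh profiles into solitary-wave building blocks.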
With the help of the symbolic computation system Maple, the Riccati equation mapping approach and a linear variable separation approach, a new family of complex solutions for the (2+1)-dimensional Boiti-Leon-Pempinelli system (BLP) is derived. Based on the derived solitary wave solution, some novel complex wave localized excitations are obtained.
Riccati mapping approach;Boiti-Leon-Pempinelli system;complex solutions;localized excitations
WoS
null
With the help of the symbolic computation system Maple, the Riccati equation (xi' = a0 + a1 xi + a2 xi^2) expansion method, and a linear variable separation approach, a new family of exact solutions with q = lx + my + nt + Gamma(x, y, t) for the (2+1)-dimensional generalized Calogero-Bogoyavlenskii-Schiff (GCBS) system is derived. Based on the derived solitary wave solution, some novel localized excitations such as fusion, fission, and annihilation of complex waves are investigated.
Riccati equation expansion method;GCBS system;exact solutions;fusion;fission and annihilation of complex waves
WoS
null
With the high prevalence of potentially traumatic events and subsequent associated mental health problems and impaired functioning, there is a need for graduate training in trauma psychology. A national survey was conducted of all North American doctoral programs in psychology to ascertain the current status of training in trauma. Training directors were sent email invitations and asked to complete a Web-based survey. A total of 151 of the 398 programs responded with adequate information and were included in the analyses. Only 1 in 5 offered a trauma psychology course as well as a practicum specifically working with traumatized populations. The most commonly cited barriers to addressing trauma were limited capacity for elective courses and little time and resources. Attention to trauma issues is important for the development of competent professional psychologists. Ways that doctoral programs can facilitate the development of knowledge, skills and attitudes in trauma psychology are discussed.
trauma;posttraumatic stress disorders;psychology;graduate training
WoS
null
With the importance China attaches to agricultural issues, there are more and more infrastructure construction projects in the countryside, among which water conservancy projects are important. Such infrastructure can not only promote the sustained and stable development of the countryside but also help guarantee China's food security, improve rural economic development and raise peasants' incomes. However, small water conservancy projects in the Chinese countryside have problems that may prevent rural development from keeping pace with the demands of the era. It is therefore necessary to take corresponding measures to address these problems and promote China's development. This paper analyzes the problems of small rural water conservancy projects in China and strives to improve the efficiency of engineering construction management.
Countryside;Small water conservancy project;Construction management
WoS
null
With the increase in device variability, the performance uncertainty poses a daunting challenge to analog/mixed-signal circuit design. This situation requires a robust design approach to add large margins to the circuit and system-level specification to ensure correct operation and the overall yield. In this paper, we propose a new robust design approach by using norm metrics to quantify the robustness for both design parameters and performance uncertainty. In addition, we adopt a surrogating procedure to achieve robustness in design space and to reduce uncertainty in performance space. The end result of the proposed method is a Pareto-surface that provides the designer with trade-offs between design robustness and performance uncertainty. One advantage of this new approach is the ability to take into account the strong nonlinear relationship between performance and design parameters. Considering a set of highly nonlinear circuit performances, we demonstrate the effectiveness of this robust design framework on a fully CMOS operational amplifier circuit. (C) 2015 Elsevier B.V. All rights reserved.
Robust design;Robustness metric;ElasticR method;Surrogates;Process variations
WoS