The deegree framework - Spatial Data Infrastructure solution for end-users and developers
NASA Astrophysics Data System (ADS)
Kiehle, Christian; Poth, Andreas
2010-05-01
The open source software framework deegree is a comprehensive implementation of standards as defined by ISO and the Open Geospatial Consortium (OGC). It has been developed with two goals in mind: to provide a uniform framework for implementing Spatial Data Infrastructures (SDIs) and to adhere to standards as strictly as possible. Although it is open source software (GNU Lesser General Public License, LGPL), deegree has been developed with a business model in mind: providing the general building blocks of SDIs without license fees, while customization, consulting and tailoring are offered by specialized companies. The core of deegree is a comprehensive Java Application Programming Interface (API) offering access to spatial features, analysis, metadata and coordinate reference systems. As a library, deegree can be and has been integrated as a core module inside spatial information systems. It is a reference implementation for several OGC standards and is based on an ISO 19107 geometry model. For end users, deegree is shipped as a web application providing easy-to-set-up components for web mapping and spatial analysis. Since 2000, deegree has been the backbone of many productive SDIs, first and foremost for governmental stakeholders (e.g. the Federal Agency for Cartography and Geodesy in Germany and the Ministry of Housing, Spatial Planning and the Environment in the Netherlands), as well as for research and development projects as an early adopter of standards, drafts and discussion papers. Besides mature standards like the Web Map Service, the Web Feature Service and Catalogue Services, deegree also implements rather new standards like the Sensor Observation Service, the Web Processing Service and the Web Coordinate Transformation Service (WCTS). While a robust background in standardization (knowledge and implementation) is a must for consultancy, standard-compliant services and encodings alone do not provide solutions for customers. The added value comes from a sophisticated set of client software and desktop and web environments. One focus lies on client solutions for specific standards like the Web Processing Service and the Web Coordinate Transformation Service. On the other hand, complex geoportal solutions that combine multiple standards and are enhanced by components for user management, security and map client functionality show the demanding requirements of real-world solutions. The XPlan-GML standard as defined by the German spatial planning authorities is a good example of how complex real-world requirements can get. XPlan-GML is intended to provide a framework for digital spatial planning documents and requires complex Geography Markup Language (GML) features along with Symbology Encoding (SE), Filter Encoding (FE), Web Map Services (WMS) and Web Feature Services (WFS). This complex infrastructure is to be used by urban and spatial planners and therefore requires a user-friendly graphical interface that hides the complexity of the underlying infrastructure. Based on the challenges faced within customer projects, the importance of easy-to-use software components is emphasized. SDI solutions should be built upon ISO/OGC standards but, more importantly, should be user-friendly and support users in spatial data management and analysis.
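As an illustration of the kind of standards-based access deegree exposes, the following is a minimal sketch of a WMS 1.3.0 GetCapabilities request in Python; the base URL is a hypothetical placeholder, and any standards-compliant WMS (deegree-backed or not) answers the same key-value-pair request.

```python
# Minimal sketch, assuming a hypothetical deegree-backed WMS endpoint.
import requests

BASE_URL = "https://example.org/deegree/services/wms"  # hypothetical endpoint

params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetCapabilities",  # ask the server to describe its layers
}
response = requests.get(BASE_URL, params=params, timeout=30)
response.raise_for_status()
# The response is an XML capabilities document listing layers, CRSs and formats.
print(response.text[:500])
```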
Map-IT! A Web-Based GIS Tool for Watershed Science Education.
ERIC Educational Resources Information Center
Curtis, David H.; Hewes, Christopher M.; Lossau, Matthew J.
This paper describes the development of a prototypic, Web-accessible GIS solution for K-12 science education and citizen-based watershed monitoring. The server side consists of ArcView IMS running on an NT workstation. The client is built around MapCafe. The client interface, which runs through a standard Web browser, supports standard MapCafe…
Security and Dependability Solutions for Web Services and Workflows
NASA Astrophysics Data System (ADS)
Kokolakis, Spyros; Rizomiliotis, Panagiotis; Benameur, Azzedine; Sinha, Smriti Kumar
In this chapter we present an innovative approach towards the design and application of Security and Dependability (S&D) solutions for Web services and service-based workflows. Recently, several standards have been published that prescribe S&D solutions for Web services, e.g. OASIS WS-Security. However, the application of these solutions in specific contexts has proven problematic. We propose a new framework for the application of such solutions based on the SERENITY S&D Pattern concept. An S&D Pattern comprises all the necessary information for the implementation, verification, deployment, and active monitoring of an S&D solution. Thus, system developers may rely on proven solutions that are dynamically deployed and monitored by the SERENITY Runtime Framework. Finally, we further extend this approach to cover the case of executable workflows which are realised through the orchestration of Web services.
An efficient architecture to support digital pathology in standard medical imaging repositories.
Marques Godinho, Tiago; Lebre, Rui; Silva, Luís Bastião; Costa, Carlos
2017-07-01
In the past decade, digital pathology and whole-slide imaging (WSI) have been gaining momentum with the proliferation of digital scanners from different manufacturers. The literature reports significant advantages associated with the adoption of digital images in pathology, namely improvements in diagnostic accuracy and better support for telepathology. Moreover, it also offers new clinical and research applications. However, numerous barriers have been slowing the adoption of WSI, among which the most important are performance issues associated with the storage and distribution of huge volumes of data, and lack of interoperability with other hospital information systems, most notably Picture Archiving and Communication Systems (PACS) based on the DICOM standard. This article proposes the architecture of a Web Pathology PACS fully compliant with DICOM standard communications and data formats. The solution includes a PACS archive responsible for storing whole-slide imaging data in the DICOM WSI format, offering a communication interface based on the most recent DICOM web services. The second component is a zero-footprint viewer that runs in any web browser. It consumes data using the PACS archive's standard web services and features a tiling engine especially suited to the WSI image pyramids. These components were designed with a special focus on efficiency and usability. The performance of our system was assessed through a comparative analysis with state-of-the-art solutions. The results demonstrate that it is possible to have a very competitive solution based on standard workflows.
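As a rough sketch of the DICOMweb access pattern such an archive offers, the snippet below requests one rendered frame (a pyramid tile, in the WSI case) through the standard Retrieve Rendered transaction; the base URL and UIDs are hypothetical placeholders.

```python
# Minimal sketch, assuming a hypothetical DICOMweb (WADO-RS) archive.
import requests

BASE = "https://pacs.example.org/dicom-web"                # hypothetical base URL
STUDY, SERIES, INSTANCE = "1.2.3", "1.2.3.4", "1.2.3.4.5"  # placeholder UIDs

# Retrieve frame 1 of the instance rendered as consumer-format JPEG,
# which a zero-footprint browser viewer can display directly.
url = f"{BASE}/studies/{STUDY}/series/{SERIES}/instances/{INSTANCE}/frames/1/rendered"
resp = requests.get(url, headers={"Accept": "image/jpeg"}, timeout=30)
resp.raise_for_status()
with open("tile.jpg", "wb") as f:
    f.write(resp.content)
```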
Eagleson, Roy; Altamirano-Diaz, Luis; McInnis, Alex; Welisch, Eva; De Jesus, Stefanie; Prapavessis, Harry; Rombeek, Meghan; Seabrook, Jamie A; Park, Teresa; Norozi, Kambiz
2017-03-17
With the increasing implementation of web-based, mobile health interventions in clinical trials, it is crucial for researchers to address the security and privacy concerns of patient information according to high ethical standards. The full process of meeting these standards is often complicated by the use of internet-based technology and smartphones for treatment, telecommunication, and data collection; however, this process is not well documented in the literature. The Smart Heart Trial is a single-arm feasibility study that is currently assessing the effects of a web-based, mobile lifestyle intervention for overweight and obese children and youth with congenital heart disease in Southwestern Ontario. Participants receive telephone counseling regarding nutrition and fitness and complete goal-setting activities on a web-based application. This paper provides a detailed overview of the challenges the study faced in meeting the high standards of our Research Ethics Board, specifically regarding patient privacy. We outline our solutions, successes, limitations, and lessons learned to inform similar future studies, and to model much-needed transparency in ensuring high-quality security and protection of patient privacy when using web-based and mobile devices for telecommunication and data collection in clinical research.
NASA Astrophysics Data System (ADS)
Agrawal, Arun; Koff, David; Bak, Peter; Bender, Duane; Castelli, Jane
2015-03-01
The deployment of regional and national Electronic Health Record solutions has been a focus of many countries throughout the past decade. A major challenge for these deployments has been support for ubiquitous image viewing. More specifically, these deployments require an imaging solution that can work over the Internet, leverage any point-of-service device (desktop, tablet, phone), and access imaging data from any source seamlessly. Whereas standards exist to enable ubiquitous image viewing, few if any solutions exist that leverage these standards and meet the challenge. Rather, most of the currently available web-based DI viewing solutions are either proprietary solutions or require special plugins. We developed a true zero-footprint, browser-based DI viewing solution based on the Web Access to DICOM Objects (WADO) and Cross-enterprise Document Sharing for Imaging (XDS-I.b) standards to a) demonstrate that a truly ubiquitous image viewer can be deployed and b) identify the gaps in the current standards and the design challenges in developing such a solution. The objective was to develop a viewer that works on all modern browsers on both desktop and mobile devices. The implementation allows the basic viewing functionalities of scroll, zoom, pan and window leveling (limited). The major gap identified in the current DICOM WADO standards is the lack of support for any kind of 3D reconstruction or MPR views. Other design challenges explored include considerations related to optimizing the solution for response time and a low memory footprint.
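For comparison with the newer DICOMweb services, a classic WADO-URI retrieval of the kind such zero-footprint viewers rely on can be sketched as follows; the endpoint and UIDs are hypothetical, while the parameter names come from the WADO-URI standard.

```python
# Minimal sketch, assuming a hypothetical WADO-URI endpoint.
import requests

params = {
    "requestType": "WADO",
    "studyUID": "1.2.3",          # placeholder UIDs
    "seriesUID": "1.2.3.4",
    "objectUID": "1.2.3.4.5",
    "contentType": "image/jpeg",  # browser-friendly rendering, not raw DICOM
}
resp = requests.get("https://imaging.example.org/wado", params=params, timeout=30)
resp.raise_for_status()
with open("slice.jpg", "wb") as f:
    f.write(resp.content)
```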
A Web Service Protocol Realizing Interoperable Internet of Things Tasking Capability.
Huang, Chih-Yuan; Wu, Cheng-Hung
2016-08-31
The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve people's daily lives. In general, IoT devices provide two main capabilities: sensing and tasking. While the sensing capability is similar to the World-Wide Sensor Web, this research focuses on the tasking capability. Currently, however, IoT devices created by different manufacturers follow different proprietary protocols and are locked into many closed ecosystems. This heterogeneity issue impedes the interconnection between IoT devices and damages the potential of the IoT. To address this issue, this research proposes an interoperable solution called the tasking capability description, which allows users to control different IoT devices using a uniform web service interface. This paper demonstrates the contribution of the proposed solution by interconnecting different IoT devices for different applications. In addition, the proposed solution is integrated with the OGC SensorThings API standard, a web service standard defined for the IoT sensing capability. Consequently, the Extended SensorThings API can realize both IoT sensing and tasking capabilities in an integrated and interoperable manner.
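The tasking idea can be sketched as a plain REST interaction in the style of the OGC SensorThings API tasking part; the service root, entity ids and taskingParameters below are hypothetical and only illustrate the uniform JSON interface the paper argues for.

```python
# Minimal sketch, assuming a hypothetical SensorThings-style tasking service.
import requests

SERVICE = "https://iot.example.org/v1.0"  # hypothetical service root

task = {
    "taskingParameters": {"power": "on"},  # device-specific parameters (assumed)
    "TaskingCapability": {"@iot.id": 1},   # capability entity advertised by the device
}
resp = requests.post(f"{SERVICE}/Tasks", json=task, timeout=30)
resp.raise_for_status()
print(resp.status_code)  # typically 201 Created when the task is accepted
```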
2005-04-12
• Hardware, Database, and Operating System independence using Java
• Enterprise-class Architecture using Java 2 Enterprise Edition 1.4
• Standards based...portal applications. Compliance with the Java Specification Request for Portlet APIs (JSR-168) (Portlet API) and Web Services for Remote Portals...authentication and authorization
• Portal Standards using the Java Specification Request for Portlet APIs (JSR-168) (Portlet API) and Web Services for Remote...
Bioinformatics data distribution and integration via Web Services and XML.
Li, Xiao; Zhang, Yizheng
2003-11-01
It is widely recognized that the exchange, distribution, and integration of biological data are the keys to improving bioinformatics and genome biology in the post-genomic era. However, the problem of exchanging and integrating biological data has not been solved satisfactorily. The eXtensible Markup Language (XML) is rapidly spreading as an emerging standard for structuring documents to exchange and integrate data on the World Wide Web (WWW). Web services are the next generation of the WWW and are founded upon the open standards of the W3C (World Wide Web Consortium) and IETF (Internet Engineering Task Force). This paper presents XML and Web Services technologies and their use as an appropriate solution to the problem of bioinformatics data exchange and integration.
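To make the XML-exchange idea concrete, here is a minimal sketch in which a toy gene record is serialized to XML on one side and parsed back on the other; the element names are an invented example vocabulary, not a real bioinformatics schema.

```python
# Minimal sketch of XML-based data exchange; element names are invented.
import xml.etree.ElementTree as ET

record = ET.Element("geneRecord")
ET.SubElement(record, "symbol").text = "TP53"           # illustrative values
ET.SubElement(record, "organism").text = "Homo sapiens"

xml_bytes = ET.tostring(record, encoding="utf-8")       # producer side: serialize
print(xml_bytes.decode("utf-8"))

parsed = ET.fromstring(xml_bytes)                       # consumer side: parse
print(parsed.findtext("symbol"), parsed.findtext("organism"))
```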
Server-Based and Server-Less BYOD Solutions to Support Electronic Learning
2016-06-01
...Knowledge Online; NSD: National Security Directive; OS: operating system; OWA: Outlook Web Access; PC: personal computer; PED: personal electronic device; PDA... "mobile devices, institute mobile device policies and standards, and promote the development and use of DOD mobile and web-enabled applications" (DOD... with an isolated BYOD web server, properly educated system administrators must carry out and execute the necessary, pre-defined network security...
Leveraging the Semantic Web for Adaptive Education
ERIC Educational Resources Information Center
Kravcik, Milos; Gasevic, Dragan
2007-01-01
In the area of technology-enhanced learning, reusability and interoperability issues essentially influence the productivity and efficiency of learning and authoring solutions. There are two basic approaches to overcoming these problems--one attempts to do it via standards and the other by means of the Semantic Web. In practice, these approaches…
Boulos, Maged N Kamel; Honda, Kiyoshi
2006-01-01
Open Source Web GIS software systems have reached a stage of maturity, sophistication, robustness and stability, and usability and user friendliness rivalling that of commercial, proprietary GIS and Web GIS server products. The Open Source Web GIS community is also actively embracing OGC (Open Geospatial Consortium) standards, including WMS (Web Map Service). WMS enables the creation of Web maps that have layers coming from multiple different remote servers/sources. In this article we present one easy-to-implement Web GIS server solution that is based on the Open Source University of Minnesota (UMN) MapServer. By following the accompanying step-by-step tutorial instructions, interested readers running mainstream Microsoft® Windows machines and with no prior technical experience in Web GIS or Internet map servers will be able to publish their own health maps on the Web and add to those maps additional layers retrieved from remote WMS servers. The 'digital Asia' and 2004 Indian Ocean tsunami experiences in using free Open Source Web GIS software are also briefly described. PMID:16420699
NASA Astrophysics Data System (ADS)
Gao, Jerry Z.; Zhu, Eugene; Shim, Simon
2003-01-01
With the increasing application of the Web in e-commerce, advertising, and publication, new technologies are needed to overcome the limitations of current Web graphics. The SVG (Scalable Vector Graphics) technology is a revolutionary solution to the existing problems in current web technology. It provides precise, high-resolution web graphics using plain-text format commands. It sets a new standard for web graphic formats, allowing us to present complicated graphics with rich text fonts and colors, high printing quality, and dynamic layout capabilities. This paper provides a tutorial overview of SVG technology and its essential features, capabilities, and advantages, and reports a comparison study between SVG and other web graphics technologies.
A Pragmatic Approach to Sustainable Interoperability for the Web 2.0 World
NASA Astrophysics Data System (ADS)
Wright, D. J.; Sankaran, S.
2015-12-01
In the geosciences, interoperability is a fundamental requirement. Members of various standards organizations such as the OGC and ISO/TC 211 have done yeoman's service to promote a standards-centric approach to manage the interoperability challenges that organizations face today. The specific challenges that organizations face when adopting interoperability patterns are many. One approach, that of mandating the use of specific standards, has been reasonably successful. But scientific communities, as with all others, ultimately want their solutions to be widely accepted and used, and to this end there is a crying need to explore all possible interoperability patterns without restricting the choices to mandated standards. Standards are created by a slow and deliberative process that sometimes takes a long time to come to fruition and can therefore fall short of user expectations. Organizations are thus left with a series of seemingly orthogonal requirements when they pursue interoperability: they want a robust but agile solution, a mature approach that also satisfies the latest technology trends, and so on. Sustainable interoperability patterns need to be forward-looking and should choose the patterns and paradigms of the Web 2.0 generation. To this end, the key is to choose platform technologies that embrace multiple interoperability mechanisms built on fundamental "open" principles and which align with popular mainstream patterns. We seek to explore data-, metadata- and web-service-related interoperability patterns through the prism of building solutions that encourage strong implementer and end-user engagement, improved usability and scalability, and appealing developer frameworks that can grow the audience. The path to tread is not new; the geocommunity only needs to observe and align its end goals with current Web 2.0 patterns to realize all the benefits that today we take for granted as part of our everyday use of technology.
Ryan, Amanda; Eklund, Peter
2008-01-01
Healthcare information is composed of many types of varying and heterogeneous data. Semantic interoperability in healthcare is especially important when all these different types of data need to interact. Presented in this paper is a solution to interoperability in healthcare based on a standards-based middleware software architecture used in enterprise solutions. This architecture has been translated into the healthcare domain using a messaging and modeling standard which upholds the ideals of the Semantic Web (HL7 V3) combined with a well-known standard terminology of clinical terms (SNOMED CT).
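The combination the paper describes can be illustrated with a simplified, non-schema-valid fragment: an HL7-V3-style observation element whose code attribute carries a SNOMED CT concept. Only the SNOMED CT code and code-system OID are real; the structure is a sketch.

```python
# Minimal sketch of an HL7-V3-style clinical statement carrying a SNOMED CT code.
import xml.etree.ElementTree as ET

obs = ET.Element("observation", classCode="OBS", moodCode="EVN")
ET.SubElement(
    obs, "code",
    code="22298006",                      # SNOMED CT concept: myocardial infarction
    codeSystem="2.16.840.1.113883.6.96",  # OID identifying SNOMED CT
    displayName="Myocardial infarction",
)
print(ET.tostring(obs, encoding="unicode"))
```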
WebTag: Web browsing into sensor tags over NFC.
Echevarria, Juan Jose; Ruiz-de-Garibay, Jonathan; Legarda, Jon; Alvarez, Maite; Ayerbe, Ana; Vazquez, Juan Ignacio
2012-01-01
Information and Communication Technologies (ICTs) continue to overcome many of the challenges related to wireless sensor monitoring, such as the design of smarter embedded processors, the improvement of network architectures, the development of efficient communication protocols and the maximization of life-cycle autonomy. This work aims to improve the data transmission link in wireless sensor monitoring. The upstream communication link is usually based on standard IP technologies, but the downstream side is always masked by the proprietary protocols used for the wireless link (ZigBee, Bluetooth, RFID, etc.). This work presents a novel solution (WebTag) for direct IP-based access to a sensor tag over Near Field Communication (NFC) technology for secure applications. WebTag allows direct web access to the sensor tag by means of a standard web browser: it reads the sensor data, configures the sampling rate and implements IP-based security policies. It is, definitely, a new step towards the evolution of the Internet of Things paradigm.
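Once the tag is reachable over IP, the interaction WebTag describes reduces to ordinary HTTP; the sketch below assumes a hypothetical tag address and invented resource and parameter names.

```python
# Minimal sketch, assuming a hypothetical IP-reachable sensor tag.
import requests

TAG = "http://webtag.example.local"  # hypothetical tag address over the NFC link

readings = requests.get(f"{TAG}/sensors", timeout=10).json()  # read sensor data
print(readings)

# Reconfigure the tag, e.g. one sample per minute (parameter name assumed).
requests.post(f"{TAG}/config", json={"sampling_rate_s": 60}, timeout=10)
```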
A BPMN solution for chaining OGC services to quality assure location-based crowdsourced data
NASA Astrophysics Data System (ADS)
Meek, Sam; Jackson, Mike; Leibovici, Didier G.
2016-02-01
The Open Geospatial Consortium (OGC) Web Processing Service (WPS) standard enables access to a centralized repository of processes and services from compliant clients. A crucial part of the standard is the provision to chain disparate processes and services to form a reusable workflow. To date this has been realized by methods such as embedding XML requests, using Business Process Execution Language (BPEL) engines and other external orchestration engines. Although these allow the user to define tasks and data artifacts as web services, they are often considered inflexible and complicated, frequently due to vendor-specific solutions and inaccessible documentation. This paper introduces a new method of flexible service chaining using the standard Business Process Model and Notation (BPMN). A prototype system has been developed upon an existing open source BPMN suite to illustrate the advantages of the approach. The motivation for the software design is the qualification of crowdsourced data for use in policy-making. The software is tested as part of a project that seeks to qualify, assure, and add value to crowdsourced data in a biological monitoring use case.
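A single step of such a workflow corresponds to one WPS Execute call; the sketch below shows the standard WPS 1.0.0 key-value-pair form with a hypothetical endpoint, process identifier and input names.

```python
# Minimal sketch of a WPS 1.0.0 Execute request; process and inputs are hypothetical.
import requests

params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "qualify.PositionalAccuracy",             # hypothetical process
    "datainputs": "observations=obs.gml;reference=ref.gml", # hypothetical inputs
}
resp = requests.get("https://example.org/wps", params=params, timeout=60)
resp.raise_for_status()
print(resp.text[:400])  # ExecuteResponse XML with process status and outputs
```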
Standards-based sensor interoperability and networking SensorWeb: an overview
NASA Astrophysics Data System (ADS)
Bolling, Sam
2012-06-01
The warfighter lacks a unified Intelligence, Surveillance, and Reconnaissance (ISR) environment to conduct mission planning, command and control (C2), tasking, collection, exploitation, processing, and data discovery of disparate sensor data across the ISR Enterprise. Legacy sensors and applications are not standardized or integrated for assured, universal access. Existing tasking and collection capabilities are not unified across the enterprise, inhibiting robust C2 of ISR, including near-real-time, cross-cueing operations. To address these critical needs, the National Measurement and Signature Intelligence (MASINT) Office (NMO) and partnering Combatant Commands and Intelligence Agencies are developing SensorWeb, an architecture that harmonizes heterogeneous sensor data to a common standard so that users can discover, access, observe, subscribe to and task sensors. The SensorWeb initiative's long-term goal is to establish an open, commercial-standards-based, service-oriented framework to facilitate plug-and-play sensors. The current development effort will produce non-proprietary deliverables, intended as a Government off-the-Shelf (GOTS) solution to address the U.S. and Coalition nations' inability to quickly and reliably detect, identify, map, track, and fully understand security threats and operational activities.
Towards an e-Health Cloud Solution for Remote Regions at Bahia-Brazil.
Sarinho, V T; Mota, A O; Silva, E P
2017-12-19
This paper presents CloudMedic, an e-Health Cloud solution that manages health care services in remote regions of Bahia, Brazil. For that, six main modules (Clinic, Hospital, Supply, Administrative, Billing and Health Business Intelligence) were developed to control the health flow among health actors at health institutions. They provide a database model and procedures for health business rules, a standard gateway for data maintenance between web views and the database layer, and a multi-front-end framework based on web view and web command configurations. These resources were used by 2042 health actors in 261 health posts covering health demands from 118 municipalities in Bahia state. They also managed approximately 2.4 million health service orders and approximately 13.5 million health exams for more than 1.3 million registered patients. As a result, a collection of health functionalities available in a cloud infrastructure was successfully developed, deployed and validated in more than 28% of Bahia municipalities: a viable e-Health Cloud solution that, despite municipality limitations in remote regions, decentralized and improved access to health care services in Bahia state.
A Conditions Data Management System for HEP Experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laycock, P. J.; Dykstra, D.; Formica, A.
Conditions data infrastructure for both ATLAS and CMS has to deal with the management of several terabytes of data. Distributed computing access to this data requires particular care and attention to manage request rates of up to several tens of kHz. Thanks to the large overlap in use cases and requirements, ATLAS and CMS have worked towards a common solution for conditions data management with the aim of using this design for data-taking in Run 3. In the meantime other experiments, including NA62, have expressed an interest in this cross-experiment initiative. For experiments with a smaller payload volume and complexity, there is particular interest in simplifying the payload storage. The conditions data management model is implemented in a small set of relational database tables. A prototype access toolkit consisting of an intermediate web server has been implemented, using standard technologies available in the Java community. Access is provided through a set of REST services whose API has been described in a generic way using the standard OpenAPI specification, implemented in Swagger. Such a solution allows the automatic generation of client code and server stubs, and further allows changing the backend technology transparently. An important advantage of using a REST API for conditions access is the possibility of caching identical URLs, addressing one of the biggest challenges that large distributed computing solutions impose on conditions data access: avoiding direct DB access by means of standard web proxy solutions.
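The caching advantage mentioned above follows from the access pattern: every conditions lookup is a plain GET on a stable URL, so identical requests can be absorbed by standard web proxies. A minimal client-side sketch, with hypothetical path segments:

```python
# Minimal sketch, assuming a hypothetical conditions REST server.
import requests

BASE = "https://conditions.example.org/api"  # hypothetical endpoint

# Identical URLs return identical payload metadata, so an HTTP proxy placed
# between jobs and server can absorb most of the request rate.
url = f"{BASE}/payloads/tag/BEAMSPOT-RUN3/iov/367512"  # hypothetical tag and IOV
resp = requests.get(url, headers={"Accept": "application/json"}, timeout=30)
resp.raise_for_status()
print(resp.json())
```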
Advancements in Open Geospatial Standards for Photogrammetry and Remote Sensing from OGC
NASA Astrophysics Data System (ADS)
Percivall, George; Simonis, Ingo
2016-06-01
The necessity of open standards for effective sharing and use of remote sensing data continues to receive increasing emphasis in the policies of agencies and projects around the world. Coordination on the development of open standards for geospatial information is a vital step to ensure that the technical standards are ready to support the policy objectives. The mission of the Open Geospatial Consortium (OGC) is to advance the development and use of international standards and supporting services that promote geospatial interoperability. To accomplish this mission, OGC serves as the global forum for the collaboration of geospatial data and solution providers and users. Photogrammetry and remote sensing are sources of the largest and most complex geospatial information. Some of the most mature OGC standards for remote sensing include the Sensor Web Enablement (SWE) standards, the Web Coverage Service (WCS) suite of standards, encodings such as NetCDF, GMLJP2 and GeoPackage, and the soon-to-be-approved Discrete Global Grid Systems (DGGS) standard. In collaboration with ISPRS, OGC, working with government, research and industrial organizations, continues to advance the state of geospatial standards for full use of photogrammetry and remote sensing.
Disaster relief through composite signatures
NASA Astrophysics Data System (ADS)
Hawley, Chadwick T.; Hyde, Brian; Carpenter, Tom; Nichols, Steve
2012-06-01
A composite signature is a group of signatures that are related in such a way as to define a target or operational endeavor more completely or at higher fidelity. This paper builds on previous work developing innovative composite signatures associated with civil disasters, including physical, chemical and pattern/behavioral signatures. For the composite signature approach to be successful, it requires effective data fusion and visualization. This plays a key role both in preparedness and in the response and recovery that are critical to saving lives. Visualization tools enhance the overall understanding of the crisis by pulling together and analyzing the data, and providing a clear and complete analysis of the information to the organizations/agencies dependent on it for a successful operation. An example of this, Freedom Web, is an easy-to-use data visualization and collaboration solution for use in homeland security, emergency preparedness, situational awareness, and event management. The solution provides a nationwide common operating picture for all levels of government through a web-based map interface. The tool was designed to be utilized by non-geospatial experts and is easily tailored to the specific needs of the users. Consisting of standard COTS and open source databases and a web server, it lets users view, edit, share, and highlight information easily and quickly through a standard internet browser.
Flexible solution for interoperable cloud healthcare systems.
Vida, Mihaela Marcella; Lupşe, Oana Sorina; Stoicu-Tivadar, Lăcrămioara; Bernad, Elena
2012-01-01
It is extremely important for the healthcare domain to have standardized communication, because it will improve the quality of information and, in the end, the resulting benefits will improve the quality of patients' lives. The standards proposed to be used are HL7 CDA and CCD. For better access to the medical data, a solution based on cloud computing (CC) is investigated. CC is a technology that supports flexibility, seamless care, and reduced costs of medical care. To ensure interoperability between healthcare information systems, a solution creating a Web Custom Control is presented. The control shows the database tables and fields used to configure the two standards. This control will facilitate the work of the medical staff and hospital administrators, because they can configure the local system easily and prepare it for communication with other systems. The resulting information will have higher quality and will provide knowledge that supports better patient management and diagnosis.
Investigating Methods for Serving Visualizations of Vertical Profiles
NASA Astrophysics Data System (ADS)
Roberts, J. T.; Cechini, M. F.; Lanjewar, K.; Rodriguez, J.; Boller, R. A.; Baynes, K.
2017-12-01
Several geospatial web servers, web service standards, and mapping clients exist for the visualization of two-dimensional raster and vector-based Earth science data products. However, data products with a vertical component (i.e., vertical profiles) do not have the same mature set of technologies and pose a greater technical challenge when it comes to visualizations. There are a variety of tools and proposed standards, but no obvious solution that can handle the variety of visualizations found with vertical profiles. An effort is being led by members of the NASA Global Imagery Browse Services (GIBS) team to gather a list of technologies relevant to existing vertical profile data products and user stories. The goal is to find a subset of technologies, standards, and tools that can be used to build publicly accessible web services that can handle the greatest number of use cases for the widest audience possible. This presentation will describe results of the investigation and offer directions for moving forward with building a system that is capable of effectively and efficiently serving visualizations of vertical profiles.
Web-based Communication of Water Quality Issues and Potential Solution Exploration
Many United States water bodies are impaired, i.e., do not meet applicable water quality standards. Pollutants enter water bodies from point sources (PS) and non-point sources (NPS). Loadings from PS are regulated by the Clean Water Act and permits limit them. Loadings from NPS a...
Enriching and improving the quality of linked data with GIS
NASA Astrophysics Data System (ADS)
Iwaniak, Adam; Kaczmarek, Iwona; Strzelecki, Marek; Lukowicz, Jaromar; Jankowski, Piotr
2016-06-01
Standardization of methods for data exchange in GIS has a long history predating the creation of the World Wide Web. The advent of the World Wide Web brought the emergence of new solutions for data exchange and sharing, including, more recently, standards proposed by the W3C for data exchange involving Semantic Web technologies and linked data. Despite the growing interest in integration, GIS and linked data are still two separate paradigms for describing and publishing spatial data on the Web. At the same time, both paradigms offer complementary ways of representing real-world phenomena and means of analysis using different processing functions. The complementarity of linked data and GIS can be leveraged to synergize both paradigms, resulting in richer data content and more powerful inferencing. The article presents an approach aimed at integrating linked data with GIS. The approach relies on the use of GIS tools for the integration, verification and enrichment of linked data. The GIS tools are employed to enrich linked data by furnishing access to collections of data resources, defining relationships between data resources, and subsequently facilitating GIS data integration with linked data. The proposed approach is demonstrated with examples using data from DBpedia, OSM, and tools developed by the authors for standard GIS software.
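As a small concrete example of pulling linked data into a GIS workflow, the sketch below queries the public DBpedia SPARQL endpoint for city coordinates that a GIS layer could then verify or enrich; the query shape is illustrative.

```python
# Minimal sketch: fetching coordinates from DBpedia's SPARQL endpoint.
import requests

query = """
SELECT ?city ?lat ?long WHERE {
  ?city a dbo:City ;
        geo:lat ?lat ;
        geo:long ?long .
} LIMIT 5
"""
resp = requests.get(
    "https://dbpedia.org/sparql",
    params={"query": query, "format": "application/sparql-results+json"},
    timeout=60,
)
resp.raise_for_status()
for row in resp.json()["results"]["bindings"]:
    print(row["city"]["value"], row["lat"]["value"], row["long"]["value"])
```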
A model-driven approach for representing clinical archetypes for Semantic Web environments.
Martínez-Costa, Catalina; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás; Maldonado, José Alberto
2009-02-01
The life-long clinical information of any person supported by electronic means constitutes his or her Electronic Health Record (EHR). This information is usually distributed among several independent and heterogeneous systems that may be syntactically or semantically incompatible. There are currently different standards for representing and exchanging EHR information among different systems. In advanced EHR approaches, clinical information is represented by means of archetypes. Most of these approaches use the Archetype Definition Language (ADL) to specify archetypes. However, ADL has some drawbacks when attempting to perform semantic activities in Semantic Web environments. In this work, Semantic Web technologies are used to specify clinical archetypes for advanced EHR architectures. The advantages of using the Web Ontology Language (OWL) instead of ADL are described and discussed in this work. Moreover, a solution combining Semantic Web and Model-Driven Engineering technologies is proposed to transform ADL into OWL for the CEN EN13606 EHR architecture.
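The flavour of the ADL-to-OWL mapping can be conveyed with a small rdflib sketch in which an archetype concept becomes an OWL class; the class names and namespace are invented for illustration and do not reproduce the paper's actual transformation.

```python
# Minimal sketch: representing an archetype concept as an OWL class with rdflib.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import OWL, RDF, RDFS

EHR = Namespace("http://example.org/ehr#")  # hypothetical namespace
g = Graph()
g.bind("ehr", EHR)

g.add((EHR.BloodPressureObservation, RDF.type, OWL.Class))
g.add((EHR.BloodPressureObservation, RDFS.subClassOf, EHR.Observation))
g.add((EHR.BloodPressureObservation, RDFS.label, Literal("Blood pressure archetype")))

print(g.serialize(format="turtle"))  # OWL/Turtle usable by Semantic Web tooling
```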
Web accessibility and open source software.
Obrenović, Zeljko
2009-07-01
A Web browser provides a uniform user interface to different types of information. Making this interface universally accessible and more interactive is a long-term goal still far from being achieved. Universally accessible browsers require novel interaction modalities and additional functionalities, for which existing browsers tend to provide only partial solutions. Although functionality for Web accessibility can be found in open source and free software components, their reuse and integration are complex because they were developed in diverse implementation environments, following standards and conventions incompatible with the Web. To address these problems, we have started several activities that aim at exploiting the potential of open-source software for Web accessibility. The first of these activities is the development of the Adaptable Multi-Interface COmmunicator (AMICO):WEB, an infrastructure that facilitates efficient reuse and integration of open source software components into the Web environment. The main contribution of AMICO:WEB is in enabling syntactic and semantic interoperability between Web extension mechanisms and the variety of integration mechanisms used by open source and free software components. Its design is based on our experiences in solving practical problems where we have used open source components to improve the accessibility of rich media Web applications. The second of our activities involves improving education, where we have used our platform to teach students how to build advanced accessibility solutions from diverse open-source software. We are also partially involved in the recently started Eclipse project called the Accessibility Tools Framework (ACTF), the aim of which is the development of an extensible infrastructure upon which developers can build a variety of utilities that help evaluate and enhance the accessibility of applications and content for people with disabilities. In this article we briefly report on these activities.
NASA Astrophysics Data System (ADS)
Yang, C.; Wong, D. W.; Phillips, T.; Wright, R. A.; Lindsey, S.; Kafatos, M.
2005-12-01
As a teamed partnership of the Center for Earth Observing and Space Research (CEOSR) at George Mason University (GMU), the Virginia Department of Transportation (VDOT), the Bureau of Transportation Statistics at the Department of Transportation (BTS/DOT), and Intergraph, we established Transportation Framework Data Services using the Open Geospatial Consortium (OGC) Web Feature Service (WFS) Specification to enable the sharing of transportation data at the federal level with data from BTS/DOT, at the state level through VDOT, and in industry through Intergraph. CEOSR develops WFS solutions using Intergraph software. Relevant technical documents are also developed and disseminated through the partners. The WFS is integrated with operational geospatial systems at CEOSR and VDOT. CEOSR works with Intergraph on developing WFS solutions and technical documents. The GeoMedia WebMap WFS toolkit is used with software and technical support from Intergraph. The ESRI ArcIMS WFS connector is used with GMU's campus license of ESRI products. Tested solutions are integrated with framework data service operational systems, including 1) CEOSR's interoperable geospatial information services, FGDC clearinghouse node, Geospatial One Stop (GOS) portal, and WMS services, 2) VDOT's state transportation data and GIS infrastructure, and 3) BTS/DOT's national transportation data. The project: 1) develops and deploys an operational OGC WFS 1.1 interface at CEOSR for registering with the FGDC/GOS Portal and responding to Web "POST" requests for transportation framework data as listed in Table 1; 2) builds a WFS service that returns data conforming to the drafted ANSI/INCITS L1 Standard (when available) for each identified theme in the format given by the OGC Geography Markup Language (GML) Version 3.0 or higher; 3) integrates the OGC WFS with CEOSR's clearinghouse nodes; 4) establishes a formal partnership to develop and share WFS-based geospatial interoperability technology among GMU, VDOT, BTS/DOT, and Intergraph; and 5) develops WFS-based solutions and technical documents using the GeoMedia WebMap WFS toolkit. The geospatial Web Feature Service is demonstrated to be more efficient in sharing vector data and supports direct Internet access to transportation data. The developed WFS solutions also enhance the interoperable services provided by CEOSR through the FGDC clearinghouse node and the GOS Portal.
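A client request against such a transportation WFS can be sketched in standard WFS 1.1.0 key-value-pair form; the endpoint and feature type name are hypothetical, and the response would be a GML feature collection.

```python
# Minimal sketch of a WFS 1.1.0 GetFeature request; names are hypothetical.
import requests

params = {
    "service": "WFS",
    "version": "1.1.0",
    "request": "GetFeature",
    "typename": "transportation:Roads",  # hypothetical feature type
    "maxFeatures": "10",
}
resp = requests.get("https://example.org/wfs", params=params, timeout=60)
resp.raise_for_status()
print(resp.text[:400])  # GML FeatureCollection
```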
Presentation accuracy of the web revisited: animation methods in the HTML5 era.
Garaizar, Pablo; Vadillo, Miguel A; López-de-Ipiña, Diego
2014-01-01
Using the Web to run behavioural and social experiments quickly and efficiently has become increasingly popular in recent years, but there is some controversy about the suitability of using the Web for these objectives. Several studies have analysed the accuracy and precision of different web technologies in order to determine their limitations. This paper updates the extant evidence about the presentation accuracy and precision of the Web and extends the study of accuracy and precision in the presentation of multimedia stimuli to HTML5-based solutions, which were previously untested. The accuracy and precision in the presentation of visual content in classic web technologies is acceptable for use in online experiments, although some results suggest that these technologies should be used with caution in certain circumstances. Declarative animations based on CSS are the best alternative when animation intervals are above 50 milliseconds. The performance of procedural web technologies based on the HTML5 standard is similar to that of previous web technologies. These technologies are being progressively adopted by the scientific community and have promising futures, which makes their use advisable compared with more obsolete technologies.
A web-based solution for 3D medical image visualization
NASA Astrophysics Data System (ADS)
Hou, Xiaoshuai; Sun, Jianyong; Zhang, Jianguo
2015-03-01
In this presentation, we present a web-based 3D medical image visualization solution that enables interactive processing and visualization of large medical image data over the web platform. To improve the efficiency of our solution, we adopt GPU-accelerated techniques to process images on the server side while rapidly transferring images to the HTML5-supported web browser on the client side. Compared to traditional local visualization solutions, our solution doesn't require users to install extra software or download the whole volume dataset from the PACS server. By designing this web-based solution, it is feasible for users to access the 3D medical image visualization service wherever the internet is available.
Geospatial Brokering - Challenges and Future Directions
NASA Astrophysics Data System (ADS)
White, C. E.
2012-12-01
An important feature of many brokers is to facilitate straightforward human access to scientific data while maintaining programmatic access to it for system solutions. Standards-based protocols are critical for this, and there are a number of protocols to choose from. In this discussion, we will present a web application solution that leverages certain protocols - e.g., OGC CSW, REST, and OpenSearch - to provide programmatic as well as human access to geospatial resources. We will also discuss managing resources to reduce duplication yet increase discoverability, federated search solutions, and architectures that combine human-friendly interfaces with powerful underlying data management. The changing requirements witnessed in brokering solutions over time, our recent experience participating in the EarthCube brokering hack-a-thon, and evolving interoperability standards provide insight into future technological and philosophical directions planned for geospatial broker solutions. There has been much change over the past decade, but with the unprecedented data collaboration of recent years, in many ways the challenges and opportunities are just beginning.
EarthServer - 3D Visualization on the Web
NASA Astrophysics Data System (ADS)
Wagner, Sebastian; Herzig, Pasquale; Bockholt, Ulrich; Jung, Yvonne; Behr, Johannes
2013-04-01
EarthServer (www.earthserver.eu), funded by the European Commission under its Seventh Framework Programme, is a project to enable the management, access and exploration of massive, multi-dimensional datasets using Open Geospatial Consortium (OGC) query and processing language standards like WCS 2.0 and WCPS. To this end, a server/client architecture designed to handle Petabyte/Exabyte volumes of multi-dimensional data is being developed and deployed. As an important part of the EarthServer project, six Lighthouse Applications, major scientific data exploitation initiatives, are being established to make cross-domain, Earth Sciences related data repositories available in an open and unified manner, as service endpoints based on solutions and infrastructure developed within the project. Client technology developed and deployed in EarthServer ranges from mobile and web clients to immersive virtual reality systems, all designed to interact with a physically and logically distributed server infrastructure using exclusively OGC standards. In this contribution, we present our work on a web-based 3D visualization and interaction client for Earth Sciences data using only technology found in standard web browsers, without requiring the user to install plugins or add-ons. Additionally, we are able to run the Earth data visualization client on a wide range of platforms with very different soft- and hardware requirements, such as smartphones (e.g. iOS, Android) and different desktop systems. High-quality, hardware-accelerated visualization of 3D and 4D content in standard web browsers can be realized now, and we believe it will become more and more common to use this fast, lightweight and ubiquitous platform to provide insights into big datasets without requiring the user to set up a specialized client first. With that in mind, we will also point out some of the limitations we encountered using current web technologies. Underlying the EarthServer web client, on top of HTML5, WebGL and JavaScript, we have developed the X3DOM framework (www.x3dom.org), which makes it possible to embed declarative X3D scene graphs, an ISO-standard XML-based format for representing 3D computer graphics, directly within HTML, thus enabling developers to rapidly design 3D content that blends seamlessly into HTML interfaces using JavaScript. This approach (commonly referred to as a polyfill layer) is used to mimic native web browser support for declarative 3D content and is an important component in our web client architecture.
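A typical server interaction behind such a client is a WCPS request; the sketch below posts a WCPS query through the WCS Processing extension, with a hypothetical endpoint and coverage name.

```python
# Minimal sketch of a WCPS request; endpoint and coverage name are hypothetical.
import requests

wcps = 'for c in (MeanTemperature) return encode(c[ansi("2010-05-01")], "png")'
resp = requests.post(
    "https://example.org/rasdaman/ows",  # hypothetical petascope endpoint
    data={
        "service": "WCS",
        "version": "2.0.1",
        "request": "ProcessCoverages",  # WCS Processing extension operation
        "query": wcps,
    },
    timeout=120,
)
resp.raise_for_status()
with open("slice.png", "wb") as f:
    f.write(resp.content)  # the server-side-rendered slice
```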
NASA Astrophysics Data System (ADS)
Delipetrev, Blagoj
2016-04-01
Presently, most existing software is desktop-based, designed to work on a single computer, which is a major limitation in many ways, starting from limited processing and storage power, accessibility, availability, etc. The only feasible solution lies in the web and the cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first (VM1) runs on Amazon Web Services (AWS) and the second (VM2) runs on a Xen cloud platform. The presented cloud application is developed using free and open source software, open standards and prototype code. It presents a framework for developing a specialized cloud geospatial application that needs only a web browser to be used. This cloud application is the ultimate collaboration geospatial platform, because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available all the time, accessible from everywhere, scalable, and works in a distributed computer environment; it creates a real-time multiuser collaboration platform; its programming language code and components are interoperable; and it is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services for 1) data infrastructure (DI), 2) support for water resources modelling (WRM), and 3) user management. The web services run on the two VMs, which communicate over the internet to provide services to users. The application was tested on the Zletovica river basin case study with concurrent multiple users. The application is a state-of-the-art cloud geospatial collaboration platform. The presented solution is a prototype and can be used as a foundation for developing any specialized cloud geospatial application. Further research will focus on distributing the cloud application over additional VMs and testing the scalability and availability of services.
SCHeMA web-based observation data information system
NASA Astrophysics Data System (ADS)
Novellino, Antonio; Benedetti, Giacomo; D'Angelo, Paolo; Confalonieri, Fabio; Massa, Francesco; Povero, Paolo; Tercier-Waeber, Marie-Louise
2016-04-01
It is well recognized that the need to share ocean data among non-specialized users is constantly increasing. Initiatives that are built upon international standards will contribute to simplifying data processing and dissemination, improve user accessibility (also through web browsers), facilitate the sharing of information across the integrated network of ocean observing systems, and ultimately provide a better understanding of ocean functioning. The SCHeMA (Integrated in Situ Chemical MApping probe) project is developing an open and modular sensing solution for autonomous in situ high-resolution mapping of a wide range of anthropogenic and natural chemical compounds coupled to master bio-physicochemical parameters (www.schema-ocean.eu). The SCHeMA web system is designed to ensure user-friendly data discovery, access and download as well as interoperability with other projects through a dedicated interface that implements the Global Earth Observation System of Systems - Common Infrastructure (GCI) recommendations and the international Open Geospatial Consortium - Sensor Web Enablement (OGC-SWE) standards. This approach ensures data accessibility in compliance with major European directives and recommendations. Being modular, the system allows the plug-and-play of commercially available probes as well as new sensor probes under development within the project. Access to the network of monitoring probes is provided via a web-based system interface that, being implemented as a Sensor Observation Service (SOS), provides standard interoperability and access to sensor observations through the O&M standard, as well as sensor descriptions encoded in the Sensor Model Language (SensorML). The use of common vocabularies in all metadatabases and data formats, to describe data in an already harmonized common standard, is a prerequisite for consistency and interoperability. Therefore, the SCHeMA SOS has adopted the SeaVox common vocabularies populated by the SeaDataNet network of National Oceanographic Data Centres. The SCHeMA presentation layer, a fundamental part of the software architecture, offers the user bidirectional interaction with the integrated system, allowing them to manage and configure the sensor probes, view the stored observations and metadata, and handle alarms. The overall structure of the web portal developed within the SCHeMA initiative (sensor configuration, development of a Core Profile interface for data access via OGC standards, external services such as web services, WMS and WFS, and data download and query management) will be presented and illustrated with examples of ongoing tests in coastal and open sea.
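Access to such an SOS can be sketched with a standard SOS 2.0 GetObservation request; the endpoint, offering and observed property below are hypothetical, while the key-value-pair parameters follow the SOS standard.

```python
# Minimal sketch of an SOS 2.0 GetObservation request; names are hypothetical.
import requests

params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "schema_probe_01",                # hypothetical offering
    "observedProperty": "sea_water_temperature",  # hypothetical property
}
resp = requests.get("https://example.org/sos", params=params, timeout=60)
resp.raise_for_status()
print(resp.text[:400])  # O&M-encoded observations
```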
New IEEE 11073 Standards for interoperable, networked Point-of-Care Medical Devices.
Kasparick, Martin; Schlichting, Stefan; Golatowski, Frank; Timmermann, Dirk
2015-08-01
Surgical procedures become more and more complex and the number of medical devices in the operating room (OR) increases continuously. Today's vendor-dependent solutions for integrated ORs are not able to handle this complexity; they can only form isolated solutions. Furthermore, vendor-dependent approaches result in high costs. Thus we present a service-oriented device communication approach for distributed medical systems that enables the integration and interconnection of medical devices with each other and with (medical) information systems, including plug-and-play functionality. This system will improve patient safety by making the technical complexity of a comprehensive integration manageable. It will be available as open standards that are part of the IEEE 11073 family of standards. The solution consists of a service-oriented communication technology, the so-called Medical Devices Profile for Web Services (MDPWS), a Domain Information & Service Model, and a binding between the two. The concept has been proven with demonstrators of real-world OR devices.
A Query Language for Handling Big Observation Data Sets in the Sensor Web
NASA Astrophysics Data System (ADS)
Autermann, Christian; Stasch, Christoph; Jirka, Simon; Koppe, Roland
2017-04-01
The Sensor Web provides a framework for the standardized web-based sharing of environmental observations and sensor metadata. While the issue of varying data formats and protocols is addressed by these standards, the fast-growing size of observational data is imposing new challenges for their application. Most solutions for handling big observational datasets currently focus on remote sensing applications, while big in-situ datasets relying on vector features still lack a solid approach. Conventional Sensor Web technologies may not be adequate, as the sheer size of the data transmitted and the amount of metadata accumulated may render traditional OGC Sensor Observation Services (SOS) unusable. Besides novel approaches to store and process observation data in place, e.g. by harnessing big data technologies from mainstream IT, the access layer has to be amended to utilize and integrate these large observational data archives into applications and to enable analysis. For this, an extension to the SOS will be discussed that establishes a query language to dynamically process and filter observations at the storage level, similar to the OGC Web Coverage Service (WCS) and its Web Coverage Processing Service (WCPS) extension. This will enable applications to request, e.g., spatially or temporally aggregated datasets at the resolution they are able to display or require. The approach will be developed and implemented in cooperation with the Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research, whose data catalogue comprises marine observations of physical, chemical and biological phenomena from a wide variety of sensors, including mobile platforms (research vessels, aircraft, underwater vehicles) and stationary ones (buoys, research stations). Observations are made at high temporal resolution and the resulting time series may span multiple decades.
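Since the extension itself is a proposal, the request below is necessarily speculative: it sketches how a client might ask the service for daily means over a decade instead of raw samples, using invented aggregation parameters on top of standard SOS key-value pairs.

```python
# Speculative sketch: the "aggregate" parameters are hypothetical extensions.
import requests

params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "observedProperty": "sea_water_temperature",
    "temporalFilter": "om:phenomenonTime,2010-01-01/2020-01-01",
    # Hypothetical query-language parameters for in-storage processing:
    "aggregate": "mean",
    "aggregationInterval": "P1D",  # one value per day instead of raw samples
}
resp = requests.get("https://example.org/sos", params=params, timeout=120)
resp.raise_for_status()
```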
Collaborative Action Research: A Democratic Undertaking or a Web of Collusion and Compliance?
ERIC Educational Resources Information Center
Jones, Marion; Stanley, Grant
2010-01-01
Raising standards in education has been the mantra for educational stakeholders in England for the past two decades and has informed national, regional and local agendas for school improvement. In the relentless pursuit of finding solutions to pedagogical problems, action research has been promoted as an effective strategy. Informed by an…
Web Solutions Inspire Cloud Computing Software
NASA Technical Reports Server (NTRS)
2013-01-01
An effort at Ames Research Center to standardize NASA websites unexpectedly led to a breakthrough in open source cloud computing technology. With the help of Rackspace Inc. of San Antonio, Texas, the resulting product, OpenStack, has spurred the growth of an entire industry that is already employing hundreds of people and generating hundreds of millions in revenue.
Presentation Accuracy of the Web Revisited: Animation Methods in the HTML5 Era
Garaizar, Pablo; Vadillo, Miguel A.; López-de-Ipiña, Diego
2014-01-01
Using the Web to run behavioural and social experiments quickly and efficiently has become increasingly popular in recent years, but there is some controversy about the suitability of using the Web for these objectives. Several studies have analysed the accuracy and precision of different web technologies in order to determine their limitations. This paper updates the extant evidence about presentation accuracy and precision of the Web and extends the study of accuracy and precision in the presentation of multimedia stimuli to HTML5-based solutions, which were previously untested. The accuracy and precision in the presentation of visual content in classic web technologies is acceptable for use in online experiments, although some results suggest that these technologies should be used with caution in certain circumstances. Declarative animations based on CSS are the best alternative when animation intervals are above 50 milliseconds. The performance of procedural web technologies based on the HTML5 standard is similar to that of previous web technologies. These technologies are being progressively adopted by the scientific community and have promising futures, which makes their use preferable to that of more obsolete technologies. PMID:25302791
Solution Kinetics Database on the Web
National Institute of Standards and Technology Data Gateway
SRD 40 NDRL/NIST Solution Kinetics Database on the Web (Web, free access) Data for free radical processes involving primary radicals from water, inorganic radicals and carbon-centered radicals in solution, and singlet oxygen and organic peroxyl radicals in various solvents.
GeoNetwork powered GI-cat: a geoportal hybrid solution
NASA Astrophysics Data System (ADS)
Baldini, Alessio; Boldrini, Enrico; Santoro, Mattia; Mazzetti, Paolo
2010-05-01
To set up a Spatial Data Infrastructure (SDI), the creation of a system for metadata management and discovery plays a fundamental role. An effective solution is the use of a geoportal (e.g. the FAO/ESA geoportal), which has the important benefit of being accessible from a web browser. With this work we present a solution integrating two of the available frameworks: GeoNetwork and GI-cat. GeoNetwork is an open-source software designed to improve the accessibility of a wide variety of data together with the associated ancillary information (metadata), at different scales and from multidisciplinary sources; data are organized and documented in a standard and consistent way. GeoNetwork implements both the Portal and Catalog components of a Spatial Data Infrastructure (SDI) as defined in the OGC Reference Architecture. It provides tools for managing and publishing metadata on spatial data and related services. GeoNetwork allows harvesting of various types of web data sources, e.g. OGC Web Services (CSW, WCS, WMS). GI-cat is a distributed catalog based on a service-oriented framework of modular components and can be customized and tailored to support different deployment scenarios. It can federate a multiplicity of catalog services, as well as inventory and access services, in order to discover and access heterogeneous ESS resources. The federated resources are exposed by GI-cat through several standard catalog interfaces (e.g. OGC CSW AP ISO, OpenSearch, etc.) and by the GI-cat extended interface. Specific components implement mediation services for interfacing heterogeneous service providers, each of which exposes a specific standard specification; such components are called accessors. These mediating components solve the providers' data model multiplicity by mapping the provider models onto the GI-cat internal data model, which implements the ISO 19115 Core profile. Accessors also implement the query protocol mapping: they translate query requests expressed according to the interface protocols exposed by GI-cat into the multiple query dialects spoken by the resource service providers. Currently, a number of well-accepted catalog and inventory services are supported, including several OGC Web Services, THREDDS Data Server, SeaDataNet Common Data Index, GBIF and OpenSearch engines. A GeoNetwork-powered GI-cat has been developed in order to exploit the best of the two frameworks. The new system uses a modified version of the GeoNetwork web interface that adds the capability of querying a specified GI-cat catalog and not only the GeoNetwork internal database. The resulting system consists of a geoportal in which GI-cat plays the role of the search engine. This new system makes it possible to distribute a query over the different types of data sources linked to a GI-cat. The metadata results of the query are then visualized by the GeoNetwork web interface. This configuration was tested in the framework of GIIDA, a project of the Italian National Research Council (CNR) focused on data accessibility and interoperability. A second advantage of this solution is achieved by setting up a GeoNetwork catalog amongst the accessors of the GI-cat instance. Such a configuration allows GI-cat, in turn, to run queries against the internal GeoNetwork database. This makes both the harvesting and metadata-editor functionalities provided by GeoNetwork and the distributed search functionality of GI-cat available in a consistent way through the same web interface.
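As a concrete illustration of the kind of catalogue query the combined system serves, the following sketch issues a CSW GetRecords request with OWSLib; the catalogue URL is a hypothetical placeholder, not an actual GI-cat deployment.

```python
# Hedged sketch: full-text CSW GetRecords query via OWSLib, the kind of
# request a geoportal search front end delegates to the catalog engine.
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

csw = CatalogueServiceWeb("https://gi-cat.example.org/csw")  # hypothetical

query = PropertyIsLike("csw:AnyText", "%temperature%")
csw.getrecords2(constraints=[query], maxrecords=10)

for record_id, record in csw.records.items():
    print(record_id, "-", record.title)
```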
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, M Pauline
2007-06-30
The VisPort visualization portal is an experiment in providing Web-based access to visualization functionality from any place and at any time. VisPort adopts a service-oriented architecture to encapsulate visualization functionality and to support remote access. Users employ browser-based client applications to choose data and services, set parameters, and launch visualization jobs. Visualization products, typically images or movies, are viewed in the user's standard Web browser. VisPort emphasizes visualization solutions customized for specific application communities. Finally, VisPort relies heavily on XML, and introduces the notion of visualization informatics: the formalization and specialization of information related to the process and products of visualization.
Comparison: Mediation Solutions of WSMOLX and WebML/WebRatio
NASA Astrophysics Data System (ADS)
Zaremba, Maciej; Zaharia, Raluca; Turati, Andrea; Brambilla, Marco; Vitvar, Tomas; Ceri, Stefano
In this chapter we compare the WSMO/WSML/WSMX and WebML/WebRatio approaches to the SWS-Challenge workshop mediation scenario in terms of the underlying technologies utilized and the solutions delivered. In the mediation scenario one partner uses RosettaNet to define its B2B protocol while the other operates a proprietary solution. Both teams showed how these partners could be semantically integrated.
UkrVO astronomical WEB services
NASA Astrophysics Data System (ADS)
Mazhaev, A.
2017-02-01
Ukraine Virtual Observatory (UkrVO) has been a member of the International Virtual Observatory Alliance (IVOA) since 2011. The virtual observatory (VO) is not a magic solution to all problems of data storage and processing, but it provides certain standards for building the infrastructure of an astronomical data center. The astronomical databases help with data mining and offer users easy access to observation metadata, images of regions of the celestial sphere, and results of image processing. The astronomical web services (AWS) of UkrVO give users handy tools for data selection from large astronomical catalogues for a relatively small region of interest in the sky. Examples of AWS usage are shown.
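Data-selection services of this kind typically follow the IVOA Simple Cone Search pattern: an HTTP GET with a position and search radius, returning a VOTable. A minimal sketch, assuming a hypothetical UkrVO endpoint:

```python
# Hedged sketch: IVOA Simple Cone Search request (RA/DEC/SR parameters are
# standard for the protocol; the service URL is a hypothetical placeholder).
import requests

CONE_URL = "https://vo.example.ua/conesearch"  # hypothetical endpoint

params = {
    "RA": 83.633,   # right ascension of the region centre, degrees
    "DEC": 22.014,  # declination, degrees
    "SR": 0.1,      # search radius, degrees
}

votable = requests.get(CONE_URL, params=params, timeout=30).text
print(votable[:300])  # results come back as a VOTable XML document
```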
Towards Web-based representation and processing of health information
Gao, Sheng; Mioc, Darka; Yi, Xiaolun; Anton, Francois; Oldfield, Eddie; Coleman, David J
2009-01-01
Background: There is great concern within health surveillance about how to grapple with environmental degradation, rapid urbanization, population mobility and growth. The Internet has emerged as an efficient way to share health information, enabling users to access and understand data at their fingertips. Increasingly complex problems in the health field require increasingly sophisticated computer software, distributed computing power, and standardized data sharing. To address this need, Web-based mapping is now emerging as an important tool to enable health practitioners, policy makers, and the public to understand spatial health risks, population health trends and vulnerabilities. Today several web-based health applications generate dynamic maps; however, for people to fully interpret the maps they need a description of the data sources and of the methods used in the data analysis or statistical modeling. For the representation of health information through Web-mapping applications, there is still no standard format to accommodate all fixed (such as location) and variable (such as age, gender, health outcome, etc.) indicators in the representation of health information. Furthermore, net-centric computing has not been adequately applied to support flexible health data processing and mapping online. Results: The authors of this study designed a HEalth Representation XML (HERXML) schema that consists of the semantic (e.g., health activity description, data source description, the statistical methodology used for analysis), geometric, and cartographical representations of health data. A case study was carried out on the development of web applications and services within the Canadian Geospatial Data Infrastructure (CGDI) framework for community health programs of the New Brunswick Lung Association. This study facilitated the online processing, mapping and sharing of health information, with the use of HERXML and Open Geospatial Consortium (OGC) services. It brought a new solution for better health data representation and an initial exploration of the Web-based processing of health information. Conclusion: The designed HERXML has proven to be an appropriate solution for supporting the Web representation of health information. It can be used by health practitioners, policy makers, and the public in disease etiology, health planning, health resource management, health promotion and health education. The utilization of Web-based processing services in this study provides a flexible way for users to select and use certain processing functions for health data processing and mapping via the Web. This research provides easy access to geospatial and health data for understanding the trends of diseases, and promotes the growth and enrichment of the CGDI in the public health sector. PMID:19159445
Decentralized Orchestration of Composite Ogc Web Processing Services in the Cloud
NASA Astrophysics Data System (ADS)
Xiao, F.; Shea, G. Y. K.; Cao, J.
2016-09-01
Current web-based GIS or RS applications generally rely on a centralized structure, which has inherent drawbacks such as single points of failure, network congestion, and data inconsistency. These inherent disadvantages of traditional GIS need to be overcome for new applications on the Internet or Web. Decentralized orchestration offers performance improvements in terms of increased throughput, better scalability and lower response time. This paper investigates build-time and runtime issues related to decentralized orchestration of composite geospatial processing services based on the OGC WPS standard specification. A case study of dust storm detection was carried out to evaluate the proposed method, and the experimental results indicate that the method is effective in producing high-quality solutions at a low communication cost for the geospatial processing service composition problem.
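Whatever the orchestration topology, each step ultimately invokes a WPS process. A minimal sketch of such an invocation with OWSLib follows; the service URL, process identifier and input are hypothetical placeholders, not the paper's actual dust storm detection service.

```python
# Hedged sketch: executing a single OGC WPS process with OWSLib, the
# building block any (centralized or decentralized) orchestration calls.
from owslib.wps import WebProcessingService, monitorExecution

wps = WebProcessingService("https://wps.example.org/wps")  # hypothetical

# hypothetical process identifier and literal input
execution = wps.execute("duststorm:detect", inputs=[("scene_id", "MOD021KM_2016")])
monitorExecution(execution)  # poll the status document until completion

if execution.isSucceded():
    print(execution.processOutputs[0].data)
```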
Semantic Web meets Integrative Biology: a survey.
Chen, Huajun; Yu, Tong; Chen, Jake Y
2013-01-01
Integrative Biology (IB) uses experimental or computational quantitative technologies to characterize biological systems at the molecular, cellular, tissue and population levels. IB typically involves the integration of data, knowledge and capabilities across disciplinary boundaries in order to solve complex problems. We identify a series of bioinformatics problems posed by interdisciplinary integration: (i) data integration that interconnects structured data across related biomedical domains; (ii) ontology integration that brings jargons, terminologies and taxonomies from various disciplines into a unified network of ontologies; (iii) knowledge integration that integrates disparate knowledge elements from multiple sources; (iv) service integration that builds applications out of services provided by different vendors. We argue that IB can benefit significantly from the integration solutions enabled by Semantic Web (SW) technologies. The SW enables scientists to share content beyond the boundaries of applications and websites, resulting in a web of data that is meaningful and understandable to any computer. In this review, we provide insight into how SW technologies can be used to build open, standardized and interoperable solutions for interdisciplinary integration on a global basis. We present a rich set of case studies in systems biology, integrative neuroscience, bio-pharmaceutics and translational medicine, to highlight the technical features and benefits of SW applications in IB.
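A minimal sketch of the data-integration style discussed here: a single SPARQL query spanning gene, disease and pathway assertions, issued with SPARQLWrapper. The endpoint and vocabulary URIs are hypothetical placeholders, not a specific life-science triplestore.

```python
# Hedged sketch: one cross-domain SPARQL query over an integrated graph.
from SPARQLWrapper import SPARQLWrapper, JSON

endpoint = SPARQLWrapper("https://sparql.example.org/biology")  # hypothetical
endpoint.setQuery("""
    PREFIX ex: <http://example.org/bio#>
    SELECT ?gene ?pathway WHERE {
        ?gene    ex:associatedWith ?disease .
        ?gene    ex:memberOf       ?pathway .
        ?disease ex:label          "type 2 diabetes" .
    } LIMIT 10
""")
endpoint.setReturnFormat(JSON)

for row in endpoint.query().convert()["results"]["bindings"]:
    print(row["gene"]["value"], row["pathway"]["value"])
```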
Access and privacy rights using web security standards to increase patient empowerment.
Falcão-Reis, Filipa; Costa-Pereira, Altamiro; Correia, Manuel E
2008-01-01
Electronic Health Record (EHR) systems are becoming more and more sophisticated and nowadays include numerous applications, which are accessed not only by medical professionals but also by accounting and administrative personnel. This can create problems concerning basic rights such as privacy and confidentiality. The principles, guidelines and recommendations compiled by the OECD on the protection of privacy and trans-border flows of personal data are described and considered within health information system development. Granting access to an EHR should be dependent upon the owner of the record, the patient: he must be entitled to define who is allowed to access his EHRs, beyond the access control scheme each health organization may have implemented. In this way, it is not only up to health professionals to decide who has access to what, but also the patient himself. Implementing such a policy is a step towards patient empowerment, which society should encourage and governments should promote. The paper then introduces a technical solution based on web security standards. This would give patients the ability to monitor and control which entities have access to their personal EHRs, thus empowering them with the knowledge of how much of their medical history is known and by whom. It is necessary to create standard data access protocols, mechanisms and policies to protect privacy rights and, furthermore, to enable patients to automatically track the movement (flow) of their personal data and information in the context of health information systems. This solution must be functional and, above all, user-friendly, and the interface should take into consideration usability heuristics in order to provide the user with the best tools. The current official standards on confidentiality and privacy in health care being developed within the EU are explained, in order to arrive at a consensual idea of the guidelines that all member states should follow to transfer such principles into national laws. A perspective is given on the state of the art concerning web security standards, which can be used to readily engineer health information systems that comply with these patient-empowerment goals. In conclusion, health systems with the characteristics described here are technically feasible and should be generally implemented and deployed.
Lowering the Barrier for Standards-Compliant and Discoverable Hydrological Data Publication
NASA Astrophysics Data System (ADS)
Kadlec, J.
2013-12-01
The growing need for sharing and integration of hydrological and climate data across multiple organizations has resulted in the development of distributed, services-based, standards-compliant hydrological data management and data hosting systems. The problem with these systems is their complicated set-up and deployment. Many existing systems assume that the data publisher has remote-desktop access to a locally managed server and experience with computer network setup. For corporate websites, shared web hosting services with limited root access provide an inexpensive, dynamic web presence solution using the Linux, Apache, MySQL and PHP (LAMP) software stack. In this paper, we hypothesize that a webhosting service provides an optimal, low-cost solution for hydrological data hosting. We propose a software architecture for a standards-compliant, lightweight and easy-to-deploy hydrological data management system that can be deployed on the majority of existing shared internet webhosting services. The architecture and design are validated by developing HydroServer Lite: a PHP- and MySQL-based hydrological data hosting package that is fully standards-compliant and compatible with the Consortium of Universities for the Advancement of Hydrologic Science (CUAHSI) hydrologic information system. It is already being used for management of field data collection by students of the McCall Outdoor Science School in Idaho. For testing, the HydroServer Lite software has been installed on multiple free and low-cost webhosting sites including GoDaddy, Bluehost and 000webhost. The number of steps required to set up the server is compared with the number required to set up other standards-compliant hydrologic data hosting systems, including THREDDS, istSOS and MapServer SOS.
Applying Sensor Web Technology to Marine Sensor Data
NASA Astrophysics Data System (ADS)
Jirka, Simon; del Rio, Joaquin; Mihai Toma, Daniel; Nüst, Daniel; Stasch, Christoph; Delory, Eric
2015-04-01
In this contribution we present two activities illustrating how Sensor Web technology helps to enable a flexible and interoperable sharing of marine observation data based on standards. An important foundation is the Sensor Web Architecture developed by the European FP7 project NeXOS (Next generation Low-Cost Multifunctional Web Enabled Ocean Sensor Systems Empowering Marine, Maritime and Fisheries Management). This architecture relies on the Open Geospatial Consortium's (OGC) Sensor Web Enablement (SWE) framework. It is an exemplary solution for facilitating the interoperable exchange of marine observation data within and between (research) organisations. The architecture addresses a series of functional and non-functional requirements which are fulfilled through different types of OGC SWE components. The diverse functionalities offered by the NeXOS Sensor Web architecture are shown in the following overview:
- Pull-based observation data download: achieved through the OGC Sensor Observation Service (SOS) 2.0 interface standard.
- Push-based delivery of observation data, allowing users to subscribe to new measurements that are relevant to them: several specification activities are currently under evaluation for this purpose (e.g. OGC Sensor Event Service, OGC Publish/Subscribe Standards Working Group).
- (Web-based) visualisation of marine observation data: implemented through SOS client applications.
- Configuration and control of sensor devices: ensured through the OGC Sensor Planning Service 2.0 interface.
- Bridging between sensors/data loggers and Sensor Web components: several components such as the "Smart Electronic Interface for Sensor Interoperability" (SEISI) concept are being developed for this purpose, complemented by a more lightweight SOS extension (e.g. based on the W3C Efficient XML Interchange (EXI) format).
To further advance this architecture, there is ongoing work to develop dedicated profiles of selected OGC SWE specifications that provide stricter guidance on how these standards shall be applied to marine data (e.g. SensorML 2.0 profiles stating which metadata elements are mandatory, building upon the ESONET Sensor Registry developments, etc.). Within the NeXOS project the presented architecture is implemented as a set of open source components. These implementations can be re-used by all interested scientists and data providers needing tools for publishing or consuming oceanographic sensor data. In further projects such as the European project FixO3 (Fixed-point Open Ocean Observatories), these software development activities are complemented by additional efforts to provide guidance on how Sensor Web technology can be applied in an efficient manner. This way, not only software components are made available but also documentation and information resources that help to understand which types of Sensor Web deployments are best suited to fulfil different types of user requirements.
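To make the pull-based building block concrete, the sketch below requests a SensorML 2.0 description from an SOS 2.0 endpoint, the kind of metadata document the NeXOS profiles constrain; the URL and procedure identifier are hypothetical placeholders.

```python
# Hedged sketch: KVP DescribeSensor against an OGC SOS 2.0 endpoint,
# returning a SensorML 2.0 document. All identifiers are placeholders.
import requests

SOS_URL = "https://sos.example.org/nexos/service"  # hypothetical endpoint

params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "DescribeSensor",
    "procedure": "ocean_glider_42",  # hypothetical sensor identifier
    "procedureDescriptionFormat": "http://www.opengis.net/sensorml/2.0",
}

sensorml = requests.get(SOS_URL, params=params, timeout=30).text
print(sensorml[:300])
```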
Query optimization for graph analytics on linked data using SPARQL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hong, Seokyong; Lee, Sangkeun; Lim, Seung -Hwan
2015-07-01
Triplestores that support query languages such as SPARQL are emerging as the preferred and scalable solution for representing data and metadata as massive heterogeneous graphs using Semantic Web standards. With increasing adoption, the desire to conduct graph-theoretic mining and exploratory analysis has also increased. Addressing that desire, this paper presents a solution that is the marriage of Graph Theory and the Semantic Web. We present software that can analyze Linked Data using graph operations such as counting triangles, finding eccentricity, testing connectedness, and computing PageRank directly on triplestores via the SPARQL interface. We describe the process of optimizing the performance of the SPARQL-based implementation of such popular graph algorithms by reducing the space overhead, simplifying iterative complexity and removing redundant computations through an understanding of query plans. Our optimized approach shows significant performance gains on triplestores hosted on stand-alone workstations as well as on hardware-optimized scalable supercomputers such as the Cray XMT.
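As a flavour of the approach, a naive triangle count can be phrased directly in SPARQL; each directed 3-cycle is matched three times (once per starting node), hence the division by three. This is an illustrative baseline, not the paper's optimized formulation, and the endpoint is a hypothetical placeholder.

```python
# Hedged sketch: naive directed 3-cycle counting over the SPARQL interface.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("https://triplestore.example.org/sparql")  # hypothetical
sparql.setQuery("""
    SELECT (COUNT(*) / 3 AS ?cycles)
    WHERE {
        ?a ?p1 ?b .
        ?b ?p2 ?c .
        ?c ?p3 ?a .
        FILTER(?a != ?b && ?b != ?c && ?c != ?a)
    }
""")
sparql.setReturnFormat(JSON)
print(sparql.query().convert()["results"]["bindings"][0]["cycles"]["value"])
```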
Beger, Christoph; Uciteli, Alexandr; Herre, Heinrich
2017-01-01
The number of ontologies covering widespread domains is growing steadily. BioPortal alone embraces over 500 published ontologies with nearly 8 million classes. In contrast, the vast informative content of these ontologies is directly intelligible only to experts. To overcome this deficiency, ontologies could be represented as web portals, which do not require knowledge about ontologies and their semantics but still carry as much information as possible to the end user. Furthermore, the conception of a complex web portal is a sophisticated process: many entities must be analyzed and linked to existing terminologies. Ontologies are a decent solution for gathering and storing such complex data and dependencies. Hence, automated imports of ontologies into web portals could support both of the scenarios mentioned. The Content Management System (CMS) Drupal 8 is one of many solutions for developing web presentations with little required knowledge of programming languages, and it is suitable for representing ontological entities. We developed the Drupal Upper Ontology (DUO), which models concepts of Drupal's architecture, such as nodes, vocabularies and links. DUO can be imported into ontologies to map their entities to Drupal's concepts. Because of Drupal's lack of import capabilities, we implemented the Simple Ontology Loader in Drupal (SOLID), a Drupal 8 module which allows Drupal administrators to import ontologies based on DUO. Our module generates content in Drupal from existing ontologies and makes it accessible to the general public. Moreover, Drupal offers a tagging system which may be augmented with multiple standardized and established terminologies by importing them with SOLID. Our Drupal module shows that ontologies can be used to model the content of a CMS and, vice versa, that a CMS is suitable for representing ontologies in a user-friendly way. Ontological entities are presented to the user as discrete pages with all appropriate properties, links and tags.
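The extraction step such an importer performs can be approximated in a few lines: parse an ontology and list its classes plus labels as candidates for CMS terms. This is a hedged sketch using rdflib, not the SOLID module's actual (Drupal/PHP-based) code, and the file name is a placeholder.

```python
# Hedged sketch: read an ontology and emit class IRIs with human-readable
# labels, the raw material an ontology-to-CMS importer would map to terms.
from rdflib import Graph, RDF, RDFS
from rdflib.namespace import OWL

g = Graph()
g.parse("duo_based_ontology.owl", format="xml")  # hypothetical ontology file

for cls in g.subjects(RDF.type, OWL.Class):
    label = g.value(cls, RDFS.label)
    print(cls, "->", label)  # candidate CMS term: IRI plus label
```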
Exposing Coverage Data to the Semantic Web within the MELODIES project: Challenges and Solutions
NASA Astrophysics Data System (ADS)
Riechert, Maik; Blower, Jon; Griffiths, Guy
2016-04-01
Coverage data, typically big in volume, assigns values to a given set of spatiotemporal positions, together with metadata on how to interpret those values. Existing storage formats like netCDF, HDF and GeoTIFF all have various restrictions that prevent them from being preferred formats for use over the web, especially the semantic web. Relevant factors here are the processing complexity, the semantic richness of the metadata, and the ability to request partial information, such as a subset or just the appropriate metadata. Making coverage data available within web browsers opens the door to new ways of working with such data, including new types of visualization and on-the-fly processing. As part of the European project MELODIES (http://melodiesproject.eu) we look into the challenges of exposing such coverage data in an interoperable and web-friendly way, and propose solutions using a host of emerging technologies like JSON-LD, the DCAT and GeoDCAT-AP ontologies, the CoverageJSON format, and new approaches to REST APIs for coverage data. We developed the CoverageJSON format within the MELODIES project as an additional way to expose coverage data to the web, next to having simple rendered images available through standards like OGC's WMS. CoverageJSON partially incorporates JSON-LD but does not encode individual data values as semantic resources, making use of the technology in a practical manner. The development also focused on its being a potential output format for OGC WCS. We will demonstrate how existing netCDF data can be exposed as CoverageJSON resources on the web together with a REST API that allows users to explore the data and run operations such as spatiotemporal subsetting. We will show various use cases from the MELODIES project, including reclassification of a land cover dataset client-side within the browser, with the ability for the user to influence the reclassification result, by making use of the above technologies.
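A client-side consumer of such a resource might look as follows. The sketch assumes the typical top-level CoverageJSON structure (type, parameters, domain with axes, ranges); the URL is a hypothetical placeholder and key details may differ from the final format.

```python
# Hedged sketch: fetch a CoverageJSON document and inspect its structure.
# Key names reflect the format's typical layout but may differ in detail.
import requests

doc = requests.get("https://api.example.org/coverages/sst", timeout=30).json()  # hypothetical

print(doc.get("type"))                  # e.g. "Coverage"
print(list(doc.get("parameters", {})))  # observed parameters, e.g. ["SST"]
axes = doc.get("domain", {}).get("axes", {})
print({name: len(ax.get("values", [])) for name, ax in axes.items()})  # grid shape
```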
Standardized acquisition, storing and provision of 3D enabled spatial data
NASA Astrophysics Data System (ADS)
Wagner, B.; Maier, S.; Peinsipp-Byma, E.
2017-05-01
In the area of working with spatial data, in addition to classic two-dimensional geometrical data (maps, aerial images, etc.), the need for three-dimensional spatial data (city models, digital elevation models, etc.) is increasing. Due to this increased demand, the acquisition, storage and provision of 3D-enabled spatial data in Geographic Information Systems (GIS) is more and more important. Existing proprietary solutions quickly reach their limits during data exchange and data delivery to other systems; they generate a large workload, which is very costly. However, it is noticeable that these expenses and costs can generally be reduced significantly by using standards. The aim of this research is therefore to develop a concept in the field of three-dimensional spatial data that builds on existing standards whenever possible. In this research, military image analysts are the primary user group of the system. To achieve the objective of the widest possible use of standards for spatial 3D data, existing standards, proprietary interfaces and standards under discussion have been analyzed. Since the GIS of the Fraunhofer IOSB used here already supports OGC (Open Geospatial Consortium) and NATO STANAG (NATO Standardization Agreement) standards for the most part, special attention was paid to their standards. The most promising standard is the OGC 3DPS (3D Portrayal Service) with its variants W3DS (Web 3D Service) and WVS (Web View Service). A demo system was created using a standardized workflow from data acquisition to storage and provision, showing the benefit of our approach.
78 FR 8108 - NextGen Solutions Vendors Guide
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-05
... Commerce is developing a web-based NextGen Solutions Vendors Guide intended to be used by foreign air... being listed on the Vendors Guide Web site should submit their company's name, Web site address, contact... to aviation system upgrades) Example: Engineering Services More information on the four ICAO ASBU...
Inactivation of Biological Agents Using Neutral Oxone-Chloride Solutions
2006-01-31
Published in Environmental Science & Technology, Vol. 40, No. 8, pp. 2759-2764 (published on Web 03/15/2006); approved for public release, distribution unlimited. Inactivation studies: greater than 8-log inactivation of E. coli was obtained within 30 s of exposure.
Moisil, Ioana; Barbat, Boldur E
2004-01-01
Romanian healthcare is facing a number of challenges, from growing general costs, through requests for better services, inadequate territorial coverage, medical errors and a growing incidence of chronic diseases, to the burden of debt toward the pharmaceutical industry. For the last 14 years decision makers have been searching for the magic formula for restructuring the healthcare sector. Eventually, the government has come to appreciate the benefits of IT solutions. Our paper presents recent advances in wireless technologies and their impact on healthcare, in parallel with the results of a study aimed at assessing the presence of the medical community on the Romanian Web and at evaluating its degree of accessibility for the general population. We have documented Web sites promoting health services, discussion forums for patients, online medical advice, medical image teleprocessing, health education, health research and documentation, pharmaceutical products, e-procurement, health portals, medical links, and hospitals and other health units present on the Web. Initial results show that if the current trend of decreasing prices for mobile communications continues, and if the government is able to provide funding for the communication infrastructure needed for pervasive healthcare systems, together with the appropriate regulations and standards, this can be a long-term viable solution to the healthcare crisis.
Search, Read and Write: An Inquiry into Web Accessibility for People with Dyslexia.
Berget, Gerd; Herstad, Jo; Sandnes, Frode Eika
2016-01-01
Universal design in the context of digitalisation has become an integrated part of international conventions and national legislation. A goal is to make the Web accessible to people of different genders, ages, backgrounds, cultures and physical, sensory and cognitive abilities. Political demands for universally designed solutions have raised questions about how this is achieved in practice. Developers, designers and legislators have looked towards the Web Content Accessibility Guidelines (WCAG) for answers. WCAG 2.0 has become the de facto standard for universal design on the Web. Some of the guidelines are directed at the general population, while others are targeted at more specific user groups, such as the visually impaired or hearing impaired. Issues related to cognitive impairments such as dyslexia receive less attention, although dyslexia is prevalent in at least 5-10% of the population. Navigation and search are two common ways of using the Web. However, while navigation has received a fair amount of attention, search systems are not explicitly included, although search has become an important part of people's daily routines. This paper discusses WCAG in the context of dyslexia for the Web in general and for search user interfaces specifically. Although certain guidelines address topics that affect dyslexia, WCAG does not seem to fully accommodate users with dyslexia.
Miles, Alistair; Zhao, Jun; Klyne, Graham; White-Cooper, Helen; Shotton, David
2010-10-01
Integrating heterogeneous data across distributed sources is a major requirement for in silico bioinformatics supporting translational research. For example, genome-scale data on patterns of gene expression in the fruit fly Drosophila melanogaster are widely used in functional genomic studies in many organisms to inform candidate gene selection and validate experimental results. However, current data integration solutions tend to be heavyweight, and require significant initial and ongoing investment of effort. Development of a common Web-based data integration infrastructure (a.k.a. data web), using Semantic Web standards, promises to alleviate these difficulties, but little is known about the feasibility, costs, risks or practical means of migrating to such an infrastructure. We describe the development of OpenFlyData, a proof-of-concept system integrating gene expression data on D. melanogaster, combining Semantic Web standards with lightweight approaches to Web programming based on Web 2.0 design patterns. To support researchers designing and validating functional genomic studies, OpenFlyData includes user-facing search applications providing intuitive access to and comparison of gene expression data from FlyAtlas, the BDGP in situ database, and FlyTED, using data from FlyBase to expand and disambiguate gene names. OpenFlyData's services are also openly accessible, and are available for reuse by other bioinformaticians and application developers. Semi-automated methods and tools were developed to support the labour- and knowledge-intensive tasks involved in deploying SPARQL services. These include methods for generating ontologies and relational-to-RDF mappings for relational databases, which we illustrate using the FlyBase Chado database schema; and methods for mapping gene identifiers between databases. The advantages of using Semantic Web standards for biomedical data integration are discussed, as are open issues. In particular, although the performance of open source SPARQL implementations is sufficient to query gene expression data directly from user-facing applications such as Web-based data fusions (a.k.a. mashups), we found open SPARQL endpoints to be vulnerable to denial-of-service-type problems, which must be mitigated to ensure the reliability of services based on this standard. These results are relevant to data integration activities in translational bioinformatics. The gene expression search applications and SPARQL endpoints developed for OpenFlyData are deployed at http://openflydata.org. FlyUI, a library of JavaScript widgets providing re-usable user-interface components for Drosophila gene expression data, is available at http://flyui.googlecode.com. Software and ontologies to support transformation of data from FlyBase, FlyAtlas, BDGP and FlyTED to RDF are available at http://openflydata.googlecode.com. SPARQLite, an implementation of the SPARQL protocol, is available at http://sparqlite.googlecode.com. All software is provided under the GPL version 3 open source license.
The GEOSS Clearinghouse based on the GeoNetwork opensource
NASA Astrophysics Data System (ADS)
Liu, K.; Yang, C.; Wu, H.; Huang, Q.
2010-12-01
The Global Earth Observation System of Systems (GEOSS) is established to support the study of the Earth system in a global community. It provides services for social management, quick response, academic research, and education. The purpose of GEOSS is to achieve comprehensive, coordinated and sustained observations of the Earth system, improve monitoring of the state of the Earth, increase understanding of Earth processes, and enhance prediction of the behavior of the Earth system. In 2009, GEO called for a competition for an official GEOSS Clearinghouse to be selected as a means of consolidating catalogs of Earth observations. The Joint Center for Intelligent Spatial Computing at George Mason University worked with USGS to submit a solution based on the open-source platform GeoNetwork. In the spring of 2010, the solution was selected as the product for the GEOSS Clearinghouse. The GEOSS Clearinghouse is a common search facility for the intergovernmental Group on Earth Observations (GEO). By providing a list of harvesting functions in its business logic, the GEOSS Clearinghouse can collect metadata from distributed catalogs including other GeoNetwork native nodes, webDAV/sitemap/WAF, Catalogue Services for the Web (CSW) 2.0, the GEOSS Component and Service Registry (http://geossregistries.info/), OGC Web Services (WCS, WFS, WMS and WPS), OAI Protocol for Metadata Harvesting 2.0, ArcSDE Server and the local file system. Metadata in the GEOSS Clearinghouse are managed in a database (MySQL, PostgreSQL, Oracle, or Mckoi) and an index of the metadata is maintained through the Lucene engine. Thus, EO data, services, and related resources can be discovered and accessed. The Clearinghouse supports a variety of geospatial standards including CSW and SRU for search, FGDC and ISO metadata, and WMS-related OGC standards for data access and visualization, as linked from the metadata.
Information Retrieval System for Japanese Standard Disease-Code Master Using XML Web Service
Hatano, Kenji; Ohe, Kazuhiko
2003-01-01
An information retrieval system for the Japanese Standard Disease-Code Master using XML Web Services has been developed. XML Web Services are a new distributed processing mechanism built on standard Internet technologies. With the seamless remote method invocation that XML Web Services provide, users are able to get the latest disease-code master information from their rich desktop applications or Internet web sites that refer to this service. PMID:14728364
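The "seamless remote method invocation" can be sketched with a modern SOAP client such as zeep; the WSDL URL and operation name below are hypothetical placeholders, not the actual master-file service.

```python
# Hedged sketch: SOAP remote method invocation with the zeep client.
# WSDL location and operation name are invented for illustration.
from zeep import Client

client = Client("https://example.jp/diseasemaster/service?wsdl")  # hypothetical
result = client.service.GetDiseaseByCode("8842489")  # hypothetical operation
print(result)
```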
A novel web-enabled healthcare solution on health vault system.
Liao, Lingxia; Chen, Min; Rodrigues, Joel J P C; Lai, Xiaorong; Vuong, Son
2012-06-01
Complicated Electronic Medical Record (EMR) systems have created problems regarding easy implementation and interoperability for a Web-enabled healthcare solution, which is normally provided by an independent healthcare giver with limited IT knowledge and interest. An EMR system with a well-designed and user-friendly interface, such as the Microsoft HealthVault system, used as the back-end platform of a Web-enabled healthcare application, is one approach to dealing with these problems. This paper analyzes the patient-oriented Web-enabled healthcare service application as the new trend in delivering healthcare, moving from hospital/clinic-centric to patient-centric care, along with the current e-healthcare applications and the main back-end EMR systems. Then, we present a novel Web-enabled healthcare solution based on the Microsoft HealthVault EMR system to meet customers' needs such as low total cost, easy development and maintenance, and good interoperability. A sample system is given to show how the solution can be fulfilled, evaluated, and validated. We expect that this paper will provide a deep understanding of the available EMR systems, leading to insights for new solutions and approaches driving next-generation EMR systems.
Data Mining Web Services for Science Data Repositories
NASA Astrophysics Data System (ADS)
Graves, S.; Ramachandran, R.; Keiser, K.; Maskey, M.; Lynnes, C.; Pham, L.
2006-12-01
The maturation of web services standards and technologies sets the stage for a distributed "Service-Oriented Architecture" (SOA) for NASA's next generation science data processing. This architecture will allow members of the scientific community to create and combine persistent distributed data processing services and make them available to other users over the Internet. NASA has initiated a project to create a suite of specialized data mining web services designed specifically for science data. The project leverages the Algorithm Development and Mining (ADaM) toolkit as its basis. The ADaM toolkit is a robust, mature and freely available science data mining toolkit that is being used by several research organizations and educational institutions worldwide. These mining services will give the scientific community a powerful and versatile data mining capability that can be used to create higher order products such as thematic maps from current and future NASA satellite data records with methods that are not currently available. The package of mining and related services are being developed using Web Services standards so that community-based measurement processing systems can access and interoperate with them. These standards-based services allow users different options for utilizing them, from direct remote invocation by a client application to deployment of a Business Process Execution Language (BPEL) solutions package where a complex data mining workflow is exposed to others as a single service. The ability to deploy and operate these services at a data archive allows the data mining algorithms to be run where the data are stored, a more efficient scenario than moving large amounts of data over the network. This will be demonstrated in a scenario in which a user uses a remote Web-Service-enabled clustering algorithm to create cloud masks from satellite imagery at the Goddard Earth Sciences Data and Information Services Center (GES DISC).
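Of the two usage options, direct remote invocation is the simpler to sketch. The snippet below submits an asynchronous mining job and polls for completion; all URLs and field names are illustrative placeholders, not the actual ADaM service interface.

```python
# Hedged sketch: submit an asynchronous data mining job to a hypothetical
# REST-style endpoint and poll until it finishes. Names are placeholders.
import time
import requests

BASE = "https://mining.example.org/adam"  # hypothetical service root

job = requests.post(f"{BASE}/jobs", json={
    "algorithm": "kmeans",           # hypothetical parameter names
    "dataset": "modis_granule_123",
    "clusters": 5,
}, timeout=30).json()

while True:
    status = requests.get(f"{BASE}/jobs/{job['id']}", timeout=30).json()
    if status["state"] in ("done", "failed"):
        break
    time.sleep(5)
print(status)
```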
Doctors' confusion over ratios and percentages in drug solutions: the case for standard labelling
Wheeler, Daniel Wren; Remoundos, Dionysios Dennis; Whittlestone, Kim David; Palmer, Michael Ian; Wheeler, Sarah Jane; Ringrose, Timothy Richard; Menon, David Krishna
2004-01-01
The different ways of expressing concentrations of drugs in solution, as ratios or percentages or mass per unit volume, are a potential cause of confusion that may contribute to dose errors. To assess doctors' understanding of what they signify, all active subscribers to doctors.net.uk, an online community exclusively for UK doctors, were invited to complete a brief web-based multiple-choice questionnaire that explored their familiarity with solutions of adrenaline (expressed as a ratio), lidocaine (expressed as a percentage) and atropine (expressed in mg per mL), and their ability to calculate the correct volume to administer in clinical scenarios relevant to all specialties. 2974 (24.6%) replied. The mean score achieved was 4.80 out of 6 (SD 1.38). Only 85.2% and 65.8% correctly identified the mass of drug in the adrenaline and lidocaine solutions, respectively, whilst 93.1% identified the correct concentration of atropine. More would have administered the correct volume of adrenaline and lidocaine in clinical scenarios (89.4% and 81.0%, respectively) but only 65.5% identified the correct volume of atropine. The labelling of drug solutions as ratios or percentages is antiquated and confusing. Labelling should be standardized to mass per unit volume. PMID:15286190
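The conversions at the heart of the questionnaire are standard and worth making explicit; the following worked examples (ours, not the paper's) show why mass per unit volume is the least ambiguous convention:

```latex
\begin{align*}
\text{1:1000 adrenaline}  &= 1\,\mathrm{g}\ \text{in}\ 1000\,\mathrm{mL} = 1\,\mathrm{mg/mL}\\
\text{1:10000 adrenaline} &= 1\,\mathrm{g}\ \text{in}\ 10000\,\mathrm{mL} = 0.1\,\mathrm{mg/mL}\\
\text{1\% lidocaine}      &= 1\,\mathrm{g}\ \text{per}\ 100\,\mathrm{mL} = 10\,\mathrm{mg/mL}
\end{align*}
```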
Embedded Web Technology: Applying World Wide Web Standards to Embedded Systems
NASA Technical Reports Server (NTRS)
Ponyik, Joseph G.; York, David W.
2002-01-01
Embedded Systems have traditionally been developed in a highly customized manner. The user interface hardware and software, along with the interface to the embedded system, are typically unique to the system for which they are built, resulting in extra cost in terms of development time and maintenance effort. World Wide Web standards have been developed over the past ten years with the goal of allowing servers and clients to interoperate seamlessly. The client and server systems can consist of differing hardware and software platforms, but World Wide Web standards allow them to interface without knowing the details of the system at the other end of the interface. Embedded Web Technology is the merging of embedded systems with the World Wide Web. Embedded Web Technology decreases the cost of developing and maintaining the user interface by allowing the user to interface with the embedded system through a web browser running on a standard personal computer. Embedded Web Technology can also be used to simplify an embedded system's internal network.
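The core idea can be shown in a dozen lines: the embedded system runs a small HTTP server, and any standard browser becomes its user interface. A minimal sketch follows (illustrative only; a real embedded target would use a compact embedded web server rather than a desktop-class Python stack, and the endpoint and telemetry are invented).

```python
# Hedged sketch of the Embedded Web Technology idea: serve the device's
# status page over HTTP so a stock browser replaces a custom console.
from http.server import BaseHTTPRequestHandler, HTTPServer

class EmbeddedUI(BaseHTTPRequestHandler):
    def do_GET(self):
        # hypothetical telemetry rendered as a tiny HTML status page
        body = b"<html><body><h1>Pump status: OK</h1></body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

HTTPServer(("", 8080), EmbeddedUI).serve_forever()
```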
Design, Implementation and Applications of 3d Web-Services in DB4GEO
NASA Astrophysics Data System (ADS)
Breunig, M.; Kuper, P. V.; Dittrich, A.; Wild, P.; Butwilowski, E.; Al-Doori, M.
2013-09-01
The object-oriented database architecture DB4GeO was originally designed to support sub-surface applications in the geo-sciences. This is reflected in DB4GeO's geometric data model as well as in its import and export functions. Initially, these functions were designed for communication with 3D geological modeling and visualization tools such as GOCAD or MeshLab. However, it soon became clear that DB4GeO was suitable for a much wider range of applications. It is therefore natural to move away from a standalone solution and to open access to DB4GeO data through standardized OGC web services. Though REST and OGC services seem incompatible at first sight, the implementation in DB4GeO shows that an OGC-based implementation of web services may reuse parts of the DB4GeO REST implementation. Starting with initial solutions in the history of DB4GeO, this paper introduces the design, adaptation (i.e. model transformation), and first steps in the implementation of OGC Web Feature Services (WFS) and Web Processing Services (WPS) as new interfaces to DB4GeO data and operations. Among its capabilities, DB4GeO can provide data in different formats like GML, GOCAD, or DB3D XML through a WFS, and can run operations like a 3D-to-2D service or mesh simplification (progressive meshes) through a WPS. We then demonstrate an Android-based mobile 3D augmented reality viewer for DB4GeO that uses the Web Feature Service to visualize 3D geo-database query results. Finally, we explore future research work considering DB4GeO in the framework of the research group "Computer-Aided Collaborative Subway Track Planning in Multi-Scale 3D City and Building Models".
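Client-side, the new interfaces look like any other OGC endpoint. A hedged sketch of fetching features from a WFS with OWSLib follows; the service URL and type name are hypothetical placeholders, not DB4GeO's actual deployment.

```python
# Hedged sketch: WFS GetFeature via OWSLib; URL and layer are placeholders.
from owslib.wfs import WebFeatureService

wfs = WebFeatureService("https://db4geo.example.org/wfs", version="1.1.0")  # hypothetical

response = wfs.getfeature(typename=["geo:boreholes"], maxfeatures=10)  # hypothetical layer
print(response.read()[:300])  # GML-encoded features
```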
Real-time access of large volume imagery through low-bandwidth links
NASA Astrophysics Data System (ADS)
Phillips, James; Grohs, Karl; Brower, Bernard; Kelly, Lawrence; Carlisle, Lewis; Pellechia, Matthew
2010-04-01
Providing current, time-sensitive imagery and geospatial information to deployed tactical military forces or first responders continues to be a challenge. This challenge is compounded by rapid increases in sensor collection volumes, both with larger arrays and higher temporal capture rates. Focusing on the needs of these military forces and first responders, ITT developed the AGILE (Advanced Geospatial Imagery Library Enterprise) Access system, an innovative approach to solving this problem based on standard off-the-shelf techniques. The AGILE Access system is based on commercial software called Image Access Solutions (IAS) and incorporates standard JPEG 2000 processing. Our solution is implemented in an accredited, deployable form, incorporating a suite of components, including an image database, a web-based search and discovery tool, and several software tools that act in concert to process, store, and disseminate imagery from airborne systems and commercial satellites. Currently, this solution is operational within the U.S. Government tactical infrastructure and supports disadvantaged imagery users in the field. This paper presents the features and benefits of this system for disadvantaged users as demonstrated in real-world operational environments.
Granmo, Ole-Christoffer; Oommen, B John; Myrer, Svein Arild; Olsen, Morten Goodwin
2007-02-01
This paper considers the nonlinear fractional knapsack problem and demonstrates how its solution can be effectively applied to two resource allocation problems dealing with the World Wide Web. The novel solution involves a "team" of deterministic learning automata (LA). The first real-life problem relates to resource allocation in web monitoring so as to "optimize" information discovery when the polling capacity is constrained. The disadvantages of the currently reported solutions are explained in this paper. The second problem concerns allocating limited sampling resources in a "real-time" manner with the purpose of estimating multiple binomial proportions. This is the scenario encountered when the user has to evaluate multiple web sites by accessing a limited number of web pages, and the proportions of interest are the fraction of each web site that is successfully validated by an HTML validator. Using the general LA paradigm to tackle both of the real-life problems, the proposed scheme improves a current solution in an online manner through a series of informed guesses that move toward the optimal solution. At the heart of the scheme, a team of deterministic LA performs a controlled random walk on a discretized solution space. Comprehensive experimental results demonstrate that the discretization resolution determines the precision of the scheme, and that for a given precision, the current solution (to both problems) is consistently improved until a nearly optimal solution is found--even for switching environments. Thus, the scheme, while being novel to the entire field of LA, also efficiently handles a class of resource allocation problems previously not addressed in the literature.
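A deliberately simplified toy of the underlying mechanic, a discretized random walk over the allocation space, is sketched below for the web-polling problem with two pages. It is not the paper's scheme (which coordinates a team of automata and handles nonlinear, diminishing-return objectives); in this linear toy the walk simply drifts toward the more frequently changing page.

```python
# Toy illustration only: a discretized learning-automaton-style random walk
# that splits polling capacity between two web pages with unknown change
# rates. The update rule is deliberately simplified.
import random

N = 100                 # discretization resolution of the solution space
state = N // 2          # current step: fraction state/N of polls go to page 0
p_change = (0.8, 0.3)   # hidden probabilities that each page has changed

for _ in range(10_000):
    x = state / N
    page = 0 if random.random() < x else 1   # poll according to allocation
    changed = random.random() < p_change[page]
    if changed:   # informative poll: shift capacity toward this page
        state += 1 if page == 0 else -1
    else:         # wasted poll: shift capacity away from this page
        state += -1 if page == 0 else 1
    state = max(1, min(N - 1, state))        # stay inside the discretized space

print(f"fraction of polls to page 0: {state / N:.2f}")
```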
SAMuS: Service-Oriented Architecture for Multisensor Surveillance in Smart Homes
Van de Walle, Rik
2014-01-01
The design of a service-oriented architecture for multisensor surveillance in smart homes is presented as an integrated solution enabling automatic deployment, dynamic selection, and composition of sensors. Sensors are implemented as Web-connected devices with a uniform Web API. RESTdesc is used to describe the sensors, and a novel solution is presented to automatically compose Web APIs that can be applied with existing Semantic Web reasoners. We evaluated the solution by building a smart Kinect sensor that is able to dynamically switch between IR and RGB and to optimize person detection by incorporating feedback from pressure sensors, thus demonstrating collaboration among sensors to enhance the detection of complex events. The performance results show that the platform scales to many Web APIs, as composition time remains limited to a few hundred milliseconds in almost all cases. PMID:24778579
Electronic Ramp to Success: Designing Campus Web Pages for Users with Disabilities.
ERIC Educational Resources Information Center
Coombs, Norman
2002-01-01
Discusses key issues in addressing the challenge of Web accessibility for people with disabilities, including tools for Web authoring, repairing, and accessibility validation, and relevant legal issues. Presents standards for Web accessibility, including the Section 508 Standards from the Federal Access Board, and the World Wide Web Consortium's…
Integrated web system of geospatial data services for climate research
NASA Astrophysics Data System (ADS)
Okladnikov, Igor; Gordov, Evgeny; Titov, Alexander
2016-04-01
Georeferenced datasets are currently actively used for modeling, interpretation and forecasting of climatic and ecosystem changes on different spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets as well as their huge size (up to tens of terabytes for a single dataset), special software supporting studies in the climate and environmental change areas is required. An approach for integrated analysis of georeferenced climatological data sets, based on a combination of web and GIS technologies within the spatial data infrastructure paradigm, is presented. Following this approach, a dedicated data-processing web system for integrated analysis of heterogeneous georeferenced climatological and meteorological data is being developed. It is based on Open Geospatial Consortium (OGC) standards and involves many modern solutions such as an object-oriented programming model, modular composition, and JavaScript libraries based on the GeoExt library, the ExtJS framework and OpenLayers software. This work is supported by the Ministry of Education and Science of the Russian Federation, Agreement #14.613.21.0037.
Lessons Learned Implementing DOORS in a Citrix Environment
NASA Technical Reports Server (NTRS)
Bussman, Marie
2005-01-01
NASA's James Webb Space Telescope (JWST) Project is a large multi-national project with geographically dispersed contractors that all need access to the Project's requirements database. Initially, the project utilized multiple DOORS databases with the built-in partitions feature to exchange modules amongst the various contractor sites. As the requirements databases matured, the use of partitions became extremely difficult. There have been many issues, such as incompatible versions of DOORS, an inefficient mechanism for sharing modules, security concerns, performance issues, and inconsistent document import and export formats. Deployment of the client software with limited IT resources was also an issue. The solution chosen by JWST was to integrate the use of a Citrix environment with the DOORS database to address most of the project's concerns. The Citrix solution allowed a single requirements database in a secure environment via a web interface. The Citrix environment allows JWST to upgrade to the most current version of DOORS without having to coordinate multiple sites and user upgrades. The single requirements database eliminates a multitude of configuration management concerns and facilitated the standardization of documentation formats. This paper discusses the obstacles and the lessons learned throughout the installation, implementation, usage and deployment of a centralized DOORS database solution.
Discovery Mechanisms for the Sensor Web
Jirka, Simon; Bröring, Arne; Stasch, Christoph
2009-01-01
This paper addresses the discovery of sensors within the OGC Sensor Web Enablement framework. Whereas services like the OGC Web Map Service or Web Coverage Service are already well supported through catalogue services, the field of sensor networks and the corresponding discovery mechanisms is still a challenge. The focus of this article is on the use of existing OGC Sensor Web components for realizing a discovery solution. After discussing the requirements for a Sensor Web discovery mechanism, an approach is presented that was developed within the EU-funded project "OSIRIS". This solution offers mechanisms to search for sensors, exploit basic semantic relationships, harvest sensor metadata and integrate sensor discovery into already existing catalogues. PMID:22574038
Code of Federal Regulations, 2011 CFR
2011-10-01
... standards, and resolve any related issues. (c) Based on those discussions, the Project Officer shall provide... communication must meet the accessibility standards in 36 CFR 1194.22, “Web-based intranet and Internet... standards for HHS Web site content and communications materials. 311.7001 Section 311.7001 Federal...
Code of Federal Regulations, 2013 CFR
2013-10-01
... standards, and resolve any related issues. (c) Based on those discussions, the Project Officer shall provide... communication must meet the accessibility standards in 36 CFR 1194.22, “Web-based intranet and Internet... standards for HHS Web site content and communications materials. 311.7001 Section 311.7001 Federal...
Code of Federal Regulations, 2012 CFR
2012-10-01
... standards, and resolve any related issues. (c) Based on those discussions, the Project Officer shall provide... communication must meet the accessibility standards in 36 CFR 1194.22, “Web-based intranet and Internet... standards for HHS Web site content and communications materials. 311.7001 Section 311.7001 Federal...
Code of Federal Regulations, 2014 CFR
2014-10-01
... standards, and resolve any related issues. (c) Based on those discussions, the Project Officer shall provide... communication must meet the accessibility standards in 36 CFR 1194.22, “Web-based intranet and Internet... standards for HHS Web site content and communications materials. 311.7001 Section 311.7001 Federal...
New Directions in the NOAO Observing Proposal System
NASA Astrophysics Data System (ADS)
Gasson, David; Bell, Dave
For the past eight years, NOAO has been refining its on-line observing proposal system. Virtually all related processes are now handled electronically. Members of the astronomical community can submit proposals by email, web form, or the Gemini Phase I Tool. NOAO staff use the system for administrative tasks, scheduling, and compilation of various statistics. In addition, all information relevant to the TAC process is made available on-line, including the proposals themselves (in HTML, PDF and PostScript) and technical comments. Grades and TAC comments are entered and edited through web forms, and can be sorted and filtered according to specified criteria. Current development is moving away from proprietary solutions, toward open standards such as SQL (in the form of the MySQL relational database system), Perl, PHP and XML.
An Approach of Web-based Point Cloud Visualization without Plug-in
NASA Astrophysics Data System (ADS)
Ye, Mengxuan; Wei, Shuangfeng; Zhang, Dongmei
2016-11-01
With advances in three-dimensional laser scanning technology, the demand for visualization of massive point clouds is increasingly urgent. Until the introduction of WebGL, point cloud visualization was limited to desktop-based solutions; several web renderers are now available. This paper addresses the current issues in web-based point cloud visualization and proposes a method that requires no browser plug-in. The method combines ASP.NET and WebGL technologies, using the spatial database PostgreSQL to store the data and the open web technologies HTML5 and CSS3 to implement the user interface; an online visualization system for 3D point clouds is developed in JavaScript to handle the web interactions. Finally, the method is applied to a real case. Experiments show that the new model is of great practical value and avoids the shortcomings of existing WebGIS solutions.
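A server-side endpoint of the kind described could look roughly like the sketch below, written here in Python/Flask rather than the authors' ASP.NET stack; the database name, table and columns are hypothetical.

```python
# Sketch of a server endpoint that feeds a WebGL point cloud client from
# PostgreSQL. Flask stands in for the paper's ASP.NET layer; table and
# column names are invented.
from flask import Flask, jsonify, request
import psycopg2

app = Flask(__name__)

@app.route("/points")
def points():
    # Bounding box from the client, e.g. /points?xmin=0&ymin=0&xmax=10&ymax=10
    xmin = float(request.args["xmin"]); ymin = float(request.args["ymin"])
    xmax = float(request.args["xmax"]); ymax = float(request.args["ymax"])
    conn = psycopg2.connect("dbname=pointcloud")
    with conn, conn.cursor() as cur:
        cur.execute(
            "SELECT x, y, z FROM points "
            "WHERE x BETWEEN %s AND %s AND y BETWEEN %s AND %s LIMIT 100000",
            (xmin, xmax, ymin, ymax))
        rows = cur.fetchall()
    # Flat [x0, y0, z0, x1, ...] array, ready for a WebGL vertex buffer.
    flat = [coord for row in rows for coord in row]
    return jsonify(flat)
```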
Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application.
Hanwell, Marcus D; de Jong, Wibe A; Harris, Christopher J
2017-10-30
An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and command-line tools are offered to automate interaction, connecting distributed teams to this software platform on their own terms. The platform was developed openly, with all source code hosted on GitHub and automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web, going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances that can be customized to the sites/research performed. Data is stored as JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages and to send data to a JavaScript-based web client.
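For illustration, a client interaction with a RESTful JSON API of this general shape might look like the sketch below; the base URL, route and payload schema are invented, not the platform's documented API.

```python
# Illustrative client for a RESTful JSON chemistry API of this kind; the
# endpoint and payload schema are hypothetical.
import requests

BASE = "https://chemistry.example.org/api/v1"  # hypothetical base URL

molecule = {
    "name": "water",
    "formula": "H2O",
    "atoms": {"elements": [8, 1, 1],
              "coords": [[0.0, 0.0, 0.0],
                         [0.96, 0.0, 0.0],
                         [-0.24, 0.93, 0.0]]},
}

# Store a record; JSON round-trips cleanly into the JavaScript web client
# and other languages, which is the interoperability argument made above.
resp = requests.post(f"{BASE}/molecules", json=molecule, timeout=30)
resp.raise_for_status()
print(resp.json()["id"])  # server-assigned identifier (assumed field)
```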
Task 28: Web Accessible APIs in the Cloud Trade Study
NASA Technical Reports Server (NTRS)
Gallagher, James; Habermann, Ted; Jelenak, Aleksandar; Lee, Joe; Potter, Nathan; Yang, Muqun
2017-01-01
This study explored three candidate architectures for serving NASA Earth Science Hierarchical Data Format Version 5 (HDF5) data via Hyrax running on Amazon Web Services (AWS), and studied the cost and performance of each architecture using several representative use cases. The objectives of the project were to: conduct a trade study to identify one or more high-performance integrated solutions for storing and retrieving NASA HDF5 and Network Common Data Format Version 4 (netCDF4) data in a cloud (web object store) environment, the target environment being the AWS Simple Storage Service (S3); conduct the level of software development needed to properly evaluate the solutions in the trade study and to obtain the benchmarking metrics required as input to a government decision on potential follow-on prototyping; and develop a cloud cost model for the preferred data storage solution (or solutions) that accounts for different granulation and aggregation schemes as well as cost and performance trades.
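One primitive underlying such architectures is reading byte ranges of an HDF5 object directly from S3, so that subsets can be served without downloading whole granules. A hedged sketch using boto3 follows; the bucket and key names are hypothetical.

```python
# Ranged GET against S3: fetch only the first bytes of an HDF5 object
# (e.g. superblock / chunk index) instead of the whole granule.
# Bucket and key are hypothetical.
import boto3

s3 = boto3.client("s3")
resp = s3.get_object(
    Bucket="example-earthdata",
    Key="granules/sample.h5",
    Range="bytes=0-8191",  # only the leading 8 KiB
)
header = resp["Body"].read()
print(len(header), "bytes fetched")
```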
Web Services--A Buzz Word with Potentials
János T. Füstös
2006-01-01
The simplest definition of a web service is an application that provides a web API. The web API exposes the functionality of the solution to other applications and relies on other Internet-based technologies to manage communications. The resulting web services are pervasive, vendor-independent, language-neutral, and very low-cost. The main purpose of a web API...
Chong, Yap-Seng; Jiao, Nana; Luo, Nan
2018-01-01
Background In addition to recuperating from the physical and emotional demands of childbirth, first-time mothers are met with the demands of adapting to their social roles while picking up new skills to take care of their newborns. Mothers may not feel adequately prepared for parenthood if they are situated in an unsupportive environment. Postnatal psychoeducational interventions have been shown to be useful and can offer a cost-effective solution for improving maternal outcomes. Objective The objective of this study was to examine the effectiveness and cost-effectiveness of Web-based and home-based postnatal psychoeducational programs for first-time mothers on maternal outcomes. Methods A randomized controlled three-group pretest-posttest experimental design is proposed. This study plans to recruit 204 first-time mothers on their day of discharge from a public tertiary hospital in Singapore. Eligible first-time mothers will be randomly allocated to either a Web-based psychoeducation group, a home-based psychoeducation group, or a control group receiving standard care. The outcomes include maternal parental self-efficacy, social support, psychological well-being (anxiety and postnatal depression), and cost evaluation. Data will be collected at baseline, 1 month, 3 months, and 6 months post-delivery. Results The recruitment (n=204) commenced in October 2016 and was completed in February 2017, with 68 mothers in each group. The 6-month follow-up data collection was completed in August 2017. Conclusions This study may identify an effective and cost-effective Web-based postnatal psychoeducational program to improve first-time mothers' health outcomes. The provision of a widely accessible Web-based postnatal psychoeducational program will eventually lead to more positive postnatal experiences for first-time mothers and positively influence their future birth plans. Trial Registration International Standard Randomized Controlled Trial Number (ISRCTN): 45202278; http://www.isrctn.com/ISRCTN45202278 (Archived by WebCite at http://www.webcitation.org/6whx0pQ2F). PMID:29386175
Konstantinidis, Evdokimos I; Bamparopoulos, Giorgos; Bamidis, Panagiotis D
2017-05-01
Exergames have been the subject of research and technology innovation for a number of years, and different devices and technologies have been utilized to train the body and the mind of senior people and different patient groups. In the past, we presented FitForAll, the protocol efficacy of which was demonstrated through extensive (controlled) pilots with more than 116 seniors over a period of two months. The current piece of work expands this and presents the first truly web-based exergaming platform, based solely on HTML5 and JavaScript without any browser plug-in requirements. The adopted architecture (controller application communication framework) combines a unified solution for input devices such as the MS Kinect and Wii Balance Board, which may seamlessly be exploited through standard physical exercise protocols (American College of Sports Medicine guidelines), and accommodates high-detail logging; this allows for proper pilot testing and usability evaluation in ecologically valid Living Lab environments. The latter type of setup is also used herein to evaluate the web application with more than a dozen real elderly users, following quantitative approaches.
Find resources and guidance on writing for the web, keeping your content relevant, using social media, meeting accessibility standards, and transforming your content into the WebCMS to meet One EPA Web standards.
Sharma, Deepak K; Solbrig, Harold R; Tao, Cui; Weng, Chunhua; Chute, Christopher G; Jiang, Guoqian
2017-06-05
Detailed Clinical Models (DCMs) have been regarded as the basis for retaining computable meaning when data are exchanged between heterogeneous computer systems. To better support clinical cancer data capture and reporting, there is an emerging need to develop informatics solutions for standards-based clinical models in cancer study domains. The objective of this study is to develop and evaluate a cancer genome study metadata management system that serves as a key infrastructure for supporting clinical information modeling in cancer genome study domains. We leveraged a Semantic Web-based metadata repository enhanced with both the ISO 11179 metadata standard and the Clinical Information Modeling Initiative (CIMI) Reference Model. We used the common data elements (CDEs) defined in The Cancer Genome Atlas (TCGA) data dictionary, and extracted the metadata of the CDEs from the NCI Cancer Data Standards Repository (caDSR) CDE dataset rendered in the Resource Description Framework (RDF). The ITEM/ITEM_GROUP pattern defined in the latest CIMI Reference Model is used to represent reusable model elements (mini-Archetypes). We produced a metadata repository with 38 clinical cancer genome study domains, comprising a rich collection of mini-Archetype pattern instances. We performed a case study of the domain "clinical pharmaceutical" in the TCGA data dictionary and demonstrated that the enriched data elements in the metadata repository are very useful in support of building detailed clinical models. Our informatics approach leveraging Semantic Web technologies provides an effective way to build a CIMI-compliant metadata repository that would facilitate detailed clinical modeling to support use cases beyond TCGA in clinical cancer study domains.
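As a rough illustration of how such a Semantic Web repository can be queried, the sketch below issues a SPARQL query via the SPARQLWrapper library; the endpoint URL, class IRI and property names are hypothetical stand-ins for the repository's actual vocabulary.

```python
# Hedged sketch: querying an RDF metadata repository for data elements.
# Endpoint and IRIs are invented placeholders.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("http://example.org/metadata/sparql")
sparql.setQuery("""
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    SELECT ?cde ?label WHERE {
        ?cde a <http://example.org/model#DataElement> ;
             rdfs:label ?label .
    } LIMIT 10
""")
sparql.setReturnFormat(JSON)

# Each binding row carries the CDE IRI and its human-readable label.
for row in sparql.query().convert()["results"]["bindings"]:
    print(row["cde"]["value"], row["label"]["value"])
```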
Availability of the OGC geoprocessing standard: March 2011 reality check
NASA Astrophysics Data System (ADS)
Lopez-Pellicer, Francisco J.; Rentería-Agualimpia, Walter; Béjar, Rubén; Muro-Medrano, Pedro R.; Zarazaga-Soria, F. Javier
2012-10-01
This paper presents an investigation of the servers available in March 2011 that conform to the Web Processing Service interface specification published by the geospatial standards organization Open Geospatial Consortium (OGC) in 2007. This interface specification supports standard Web-based geoprocessing. The data used in this research were collected using a focused crawler configured for finding OGC Web services. The research goals are (i) to provide a reality check of the availability of Web Processing Service servers, (ii) to provide quantitative data about the use of different features defined in the standard that are relevant for a scalable Geoprocessing Web (e.g. long-running processes, Web-accessible data outputs), and (iii) to test whether advances in the use of search engines and focused crawlers for finding Web services can be applied to finding geoscience processing systems. The results show the feasibility of the discovery approach and provide data about the implementation of the Web Processing Service specification. They also show extensive use of features related to scalability, except for those related to technical and semantic interoperability.
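The basic probe such a focused crawler performs can be sketched as follows: request GetCapabilities with service=WPS and check whether a capabilities document comes back. The URL is hypothetical, and a real crawl must also handle redirects, timeouts and malformed XML.

```python
# Probe a candidate URL for a Web Processing Service. URL is hypothetical.
import requests
import xml.etree.ElementTree as ET

def looks_like_wps(url):
    try:
        resp = requests.get(
            url, params={"service": "WPS", "request": "GetCapabilities"},
            timeout=10)
        root = ET.fromstring(resp.content)
    except Exception:
        return False
    # A conforming WPS 1.0.0 response has a wps:Capabilities document root.
    return root.tag.endswith("Capabilities")

print(looks_like_wps("http://example.org/wps"))
```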
A simple method for serving Web hypermaps with dynamic database drill-down
Boulos, Maged N Kamel; Roudsari, Abdul V; Carson, Ewart R
2002-01-01
Background HealthCyberMap aims at mapping parts of health information cyberspace in novel ways to deliver a semantically superior user experience. This is achieved through "intelligent" categorisation and interactive hypermedia visualisation of health resources using metadata, clinical codes and GIS. HealthCyberMap is an ArcView 3.1 project. WebView, the Internet extension to ArcView, publishes HealthCyberMap ArcView Views as Web client-side imagemaps. The basic WebView set-up does not support any GIS database connection, and published Web maps become disconnected from the original project. A dedicated Internet map server would be the best way to serve HealthCyberMap database-driven interactive Web maps, but is an expensive and complex solution to acquire, run and maintain. This paper describes HealthCyberMap's simple, low-cost method for "patching" WebView to serve hypermaps with dynamic database drill-down functionality on the Web. Results The proposed solution is currently used for publishing HealthCyberMap GIS-generated navigational information maps on the Web while maintaining their links with the underlying resource metadata base. Conclusion The authors believe their map-serving approach as adopted in HealthCyberMap has been very successful, especially in cases where only map attribute data change without a corresponding effect on map appearance. It should also be possible to use the same solution to publish other interactive GIS-driven maps on the Web, e.g. maps of real-world health problems. PMID:12437788
Connecting geoscience systems and data using Linked Open Data in the Web of Data
NASA Astrophysics Data System (ADS)
Ritschel, Bernd; Neher, Günther; Iyemori, Toshihiko; Koyama, Yukinobu; Yatagai, Akiyo; Murayama, Yasuhiro; Galkin, Ivan; King, Todd; Fung, Shing F.; Hughes, Steve; Habermann, Ted; Hapgood, Mike; Belehaki, Anna
2014-05-01
Linked Data or Linked Open Data (LOD), in the realm of free and publicly accessible data, is one of the most promising and most used semantic Web frameworks for connecting various types of data and vocabularies, including geoscience and related domains. The semantic Web extension to the commonly used World Wide Web is based on the meaning of entities and relationships, or in other words the classes and properties used for data, in a global data and information space: the Web of Data. LOD data is referenced and mashed up by URIs and is retrievable using simple parameter-controlled HTTP requests leading to a result which is human-understandable or machine-readable. Furthermore, the publishing and mash-up of data in the semantic Web realm is realized by specific Web standards defined for the Web of Data, such as RDF, RDFS, OWL and SPARQL. Semantic Web based mash-up is the Web method of aggregating and reusing various contents from different sources, for example using FOAF as a model and vocabulary for the description of persons and organizations related, in our case, to geoscience projects, instruments, observations, data and so on. Using the example of three different geoscience data and information management systems, ESPAS, IUGONET and GFZ ISDC, and the associated science data and related metadata (better called context data), this publication describes the concept of the mash-up of systems and data using the semantic Web approach and the Linked Open Data framework. Because the three systems are based on different data models, data storage structures and technical implementations, an extra semantic Web layer on top of the existing interfaces is used for mash-up solutions. In order to satisfy the semantic Web standards, data transition processes are necessary, such as the transfer of content stored in relational databases or mapped in XML documents into SPARQL-capable databases or endpoints using D2R or XSLT. In addition, the use of mapped and/or merged domain-specific and cross-domain vocabularies, in the sense of terminological ontologies, is the foundation for virtually unified data retrieval and access in the IUGONET, ESPAS and GFZ ISDC data management systems. SPARQL endpoints, realized either by native RDF databases (e.g. Virtuoso) or by virtual SPARQL endpoints (e.g. D2R services), enable a purely Web-standard-based mash-up of domain-specific systems and data, in this case the space weather and geomagnetic domains, but also cross-domain connection to data and vocabularies, e.g. related to NASA's VxOs, particularly VWO, or NASA's PDS data system within LOD.
LOD - Linked Open Data
RDF - Resource Description Framework
RDFS - RDF Schema
OWL - Web Ontology Language
SPARQL - SPARQL Protocol and RDF Query Language
FOAF - Friend of a Friend ontology
ESPAS - Near Earth Space Data Infrastructure for e-Science (project)
IUGONET - Inter-university Upper Atmosphere Global Observation Network (project)
GFZ ISDC - German Research Centre for Geosciences Information System and Data Center
XML - Extensible Mark-up Language
D2R - (Relational) Database to RDF (transformation)
XSLT - Extensible Stylesheet Language Transformation
Virtuoso - OpenLink Virtuoso Universal Server (including RDF data management)
NASA - National Aeronautics and Space Administration
VxOs - Virtual Observatories
VWO - Virtual Wave Observatory
PDS - Planetary Data System
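At the protocol level, Linked Data retrieval is ordinary HTTP with content negotiation, which a short sketch can illustrate; the resource URI below is hypothetical.

```python
# Linked Data retrieval: ask a resource URI for RDF instead of HTML by
# setting the Accept header. The URI is a hypothetical example.
import requests

uri = "http://example.org/resource/instrument/42"
resp = requests.get(uri, headers={"Accept": "text/turtle"}, timeout=30)
resp.raise_for_status()
print(resp.text)  # Turtle triples describing the resource, ready for mash-up
```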
NASA Astrophysics Data System (ADS)
Walker, J. I.; Blodgett, D. L.; Suftin, I.; Kunicki, T.
2013-12-01
High-resolution data for use in environmental modeling are increasingly becoming available at broad spatial and temporal scales. Downscaled climate projections, remotely sensed landscape parameters, and land-use/land-cover projections are examples of datasets that may exceed an individual investigation's data management and analysis capacity. To allow projects on limited budgets to work with many of these datasets, the burden of working with them must be reduced. The approach being pursued at the U.S. Geological Survey Center for Integrated Data Analytics uses standard self-describing web services that allow machine-to-machine data access and manipulation. These techniques have been implemented and deployed in production-level server-based Web Processing Services that can be accessed from a web application or scripted workflow. Data publication techniques that allow machine interpretation of large collections of data have also been implemented for numerous datasets at U.S. Geological Survey data centers as well as partner agencies and academic institutions. Discovery of data services is accomplished using a method in which a machine-generated metadata record holds content, derived from the data's source web service, that is intended for human as well as machine interpretation. A distributed search application has been developed that demonstrates the utility of a decentralized search of data-owner metadata catalogs from multiple agencies. The integrated but decentralized system of metadata, data, and server-based processing capabilities will be presented, and the design, utility, and value of these solutions will be illustrated with applied science examples and success stories. Datasets such as the EPA's Integrated Climate and Land Use Scenarios, USGS/NASA MODIS-derived land cover attributes, and downscaled climate projections from several sources are examples of data this system includes. These and other datasets have been published as standard, self-describing web services that provide the ability to inspect and subset the data. This presentation will demonstrate this file-to-web-service concept and how it can be used from script-based workflows or web applications.
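The decentralized-search idea can be sketched as the same catalogue query fanned out to several CSW endpoints, with the hits merged client-side; the catalogue URLs and search term below are invented.

```python
# Fan a CSW GetRecords query out to several agency catalogues and merge
# the results. Endpoints and search term are hypothetical.
import requests
import xml.etree.ElementTree as ET

CATALOGUES = [
    "http://catalog.example.gov/csw",
    "http://data.example.edu/csw",
]

params = {
    "service": "CSW", "version": "2.0.2", "request": "GetRecords",
    "typeNames": "csw:Record", "elementSetName": "brief",
    "resultType": "results", "constraintLanguage": "CQL_TEXT",
    "constraint": "AnyText LIKE '%land cover%'",
}

for url in CATALOGUES:
    resp = requests.get(url, params=params, timeout=30)
    root = ET.fromstring(resp.content)
    # dc:title elements of the matched records, namespace-agnostic match
    titles = [e.text for e in root.iter() if e.tag.endswith("title")]
    print(url, titles[:5])
```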
Wagener, Johannes; Spjuth, Ola; Willighagen, Egon L; Wikberg, Jarl ES
2009-01-01
Background The life sciences make heavy use of the web for both data provision and analysis. However, the increasing amount of available data and the diversity of analysis tools call for machine-accessible interfaces in order to be effective. HTTP-based Web service technologies, like the Simple Object Access Protocol (SOAP) and REpresentational State Transfer (REST) services, are today the most common technologies for this in bioinformatics. However, these methods have severe drawbacks, including lack of discoverability and the inability for services to send status notifications. Several complementary workarounds have been proposed, but the results are ad-hoc solutions of varying quality that can be difficult to use. Results We present a novel approach based on the open standard Extensible Messaging and Presence Protocol (XMPP), consisting of an extension (IO Data) that encompasses discovery, asynchronous invocation, and definition of data types in the service. Because XMPP cloud services are capable of asynchronous communication, clients do not have to poll repetitively for status; the service sends the results back to the client upon completion. Implementations for Bioclipse and Taverna are presented, as are various XMPP cloud services in bio- and cheminformatics. Conclusion XMPP with its extensions is a powerful protocol for cloud services that demonstrates several advantages over traditional HTTP-based Web services: 1) services are discoverable without the need of an external registry, 2) asynchronous invocation eliminates the need for ad-hoc solutions like polling, and 3) input and output types defined in the service allow for generation of clients on the fly without the need of an external semantics description. The many advantages over existing technologies make XMPP a highly interesting candidate for next-generation online services in bioinformatics. PMID:19732427
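The push-instead-of-poll pattern the authors argue for can be sketched with a generic XMPP client library; the sketch below uses slixmpp (not the paper's IO Data implementation) and hypothetical JIDs: the client submits a job once and simply waits for the service to message the result back.

```python
# Push-based job submission over XMPP using slixmpp; JIDs are hypothetical
# and this is a generic sketch, not the IO Data extension itself.
import slixmpp

class JobClient(slixmpp.ClientXMPP):
    def __init__(self, jid, password, service_jid, job):
        super().__init__(jid, password)
        self.service_jid = service_jid
        self.job = job
        self.add_event_handler("session_start", self.start)
        self.add_event_handler("message", self.on_message)

    async def start(self, event):
        self.send_presence()
        await self.get_roster()
        # Submit the job once; no status polling afterwards.
        self.send_message(mto=self.service_jid, mbody=self.job, mtype="chat")

    def on_message(self, msg):
        # The service pushes the result back when the computation finishes.
        if msg["type"] == "chat":
            print("result pushed by service:", msg["body"])
            self.disconnect()

client = JobClient("user@example.org", "secret",
                   "service@example.org", "run-job")
client.connect()
client.process(forever=True)
```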
40 CFR 63.3321 - What operating limits must I meet?
Code of Federal Regulations, 2011 CFR
2011-07-01
... Standards for Hazardous Air Pollutants: Paper and Other Web Coating Emission Standards and Compliance Dates § 63.3321 What operating limits must I meet? (a) For any web coating line or group of web coating lines...
Reinforcement Learning Based Web Service Compositions for Mobile Business
NASA Astrophysics Data System (ADS)
Zhou, Juan; Chen, Shouming
In this paper, we propose a new solution to reactive Web service composition by modeling it with reinforcement learning and introducing modified (alterable) QoS variables into the model as elements of the Markov decision process tuple. Moreover, we give an example of reactive-WSC-based mobile banking to demonstrate the intrinsic capability of the solution to obtain an optimized service composition, characterized by (alterable) target QoS variable sets with optimized values. We conclude that the solution has decent potential for improving customer experience and quality of service in Web services, and in applications across the whole electronic commerce and business sector.
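The following toy sketch shows the general shape of such a model: Q-learning over a tiny composition MDP in which each action selects a candidate service and the reward is a QoS-derived utility. All states, services and reward values are invented for illustration, not taken from the paper.

```python
# Toy Q-learning over a two-step service-composition MDP. Services and
# QoS-derived rewards are invented for the sketch.
import random

states = ["start", "auth_done", "payment_done"]
actions = {"start": ["auth_A", "auth_B"],
           "auth_done": ["pay_A", "pay_B"],
           "payment_done": []}
# Reward = QoS utility of invoking each candidate service (invented numbers).
reward = {"auth_A": 0.6, "auth_B": 0.8, "pay_A": 0.9, "pay_B": 0.5}
next_state = {"auth_A": "auth_done", "auth_B": "auth_done",
              "pay_A": "payment_done", "pay_B": "payment_done"}

Q = {(s, a): 0.0 for s in states for a in actions[s]}
alpha, gamma, eps = 0.1, 0.9, 0.2  # learning rate, discount, exploration

for _ in range(2000):
    s = "start"
    while actions[s]:
        a = (random.choice(actions[s]) if random.random() < eps
             else max(actions[s], key=lambda a: Q[(s, a)]))
        s2 = next_state[a]
        best_next = max((Q[(s2, a2)] for a2 in actions[s2]), default=0.0)
        Q[(s, a)] += alpha * (reward[a] + gamma * best_next - Q[(s, a)])
        s = s2

# The learned greedy policy is the optimized composition.
print({s: max(actions[s], key=lambda a: Q[(s, a)]) for s in states if actions[s]})
```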
ERIC Educational Resources Information Center
Villano, Matt
2009-01-01
In this article, the author discusses how several institutions are turning to popular technologies to streamline and facilitate fundraising efforts. These popular solutions include social networking websites and other Web 2.0 tools, e-mail marketing products, and a new look at enterprise-level solutions. At Monmouth College, web professionals…
Polilli, Ennio; Sozio, Federica; Di Stefano, Paola; Clerico, Luigi; Di Iorio, Giancarlo; Parruti, Giustino
2018-04-01
This study aimed to analyze the efficacy of a Web-based testing programme in preventing late HIV presentation. The clinical characteristics of patients diagnosed with HIV via the Web-based testing programme were compared to those of patients diagnosed in parallel via standard diagnostic care procedures. The study included the clinical and demographic data of newly diagnosed HIV patients enrolled at the study clinic between February 2014 and June 2017, diagnosed either via standard diagnostic procedures or as a result of the Web-based testing programme. Eighty-eight new cases of HIV were consecutively enrolled; their mean age was 39.1±13.0 years. Fifty-nine patients (67%) were diagnosed through standard diagnostic procedures and 29 (33%) came from the Web-based testing programme. Late presentation (62% vs. 34%, p=0.01) and AIDS-defining conditions at presentation (13 vs. 1, p=0.02) were significantly more frequent in the standard care group than in the Web-based group; four of the 13 patients with AIDS diagnosed under standard diagnostic procedures died, versus none in the Web-based testing group (p<0.001). Web-based recruitment for voluntary and free HIV testing helped to diagnose patients with less advanced HIV disease and no risk of death, from all at-risk groups, in comparison with standard care testing.
The use of geospatial web services for exchanging utilities data
NASA Astrophysics Data System (ADS)
Kuczyńska, Joanna
2013-04-01
Geographic information technologies and related geo-information systems currently play an important role in public administration in Poland. One such task is to maintain and update the Geodetic Evidence of Public Utilities (GESUT), part of the National Geodetic and Cartographic Resource, which contains information on technical infrastructure that is important to many institutions. This requires an active exchange of data between the Geodesy and Cartography Documentation Centers and the institutions that administer transmission lines; the administrator of public utilities is legally obliged to provide information about utilities to GESUT. The aim of the research work was to develop a universal data-exchange methodology that can be implemented on a variety of hardware and software platforms. The methodology uses the Unified Modeling Language (UML), the eXtensible Markup Language (XML), and the Geography Markup Language (GML), and is based on two strategies: Model Driven Architecture (MDA) and Service Oriented Architecture (SOA). The solutions used are consistent with the INSPIRE Directive and the ISO 19100 series of standards for geographic information. On the basis of an analysis of the input data structures, conceptual models were built for both databases and written in UML. A combined model defining a common data structure was also built and transformed into GML, the standard developed for the exchange of geographic information. The structure of the documents to be exchanged is defined in an .xsd file. Network services were selected and implemented in a data-exchange system based on open-source tools. The methodology was implemented and tested: data in the agreed structure, together with metadata, were set up on a server, and access was provided by geospatial network services, with data discovery through the Catalogue Service for the Web (CSW) and data collection through the Web Feature Service (WFS). WFS also provides operations for modifying data, for example so that a utility administrator can update them. The proposed solution significantly increases the efficiency of data exchange and facilitates maintenance of the National Geodetic and Cartographic Resource.
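As an illustration of the data-collection path, a WFS GetFeature request of the kind described can be issued with plain HTTP; the endpoint and feature type name below are hypothetical.

```python
# Collect utility features as GML via WFS GetFeature. Endpoint and the
# GESUT feature type name are hypothetical.
import requests

WFS_URL = "http://example.gov.pl/geoserver/wfs"

params = {
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": "gesut:WaterPipe",   # hypothetical feature type
    "outputFormat": "application/gml+xml; version=3.2",
    "count": "100",
}
resp = requests.get(WFS_URL, params=params, timeout=60)
resp.raise_for_status()
with open("water_pipes.gml", "wb") as f:
    f.write(resp.content)  # GML conforming to the agreed .xsd schema
```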
Rational analyses of information foraging on the web.
Pirolli, Peter
2005-05-06
This article describes rational analyses and cognitive models of Web users developed within information foraging theory, following the rational analysis methodology of (a) characterizing the problems posed by the environment, (b) developing rational analyses of behavioral solutions to those problems, and (c) developing cognitive models that approach the realization of those solutions. Navigation choice is modeled as a random utility model that uses spreading activation mechanisms linking proximal cues (information scent) in Web browsers to internal user goals. Web-site leaving is modeled as an ongoing assessment by the Web user of the expected benefits of continuing at a Web site as opposed to going elsewhere; these cost-benefit assessments are also based on spreading activation models of information scent. Evaluations include a computational model of Web user behavior called Scent-Based Navigation and Information Foraging in the ACT Architecture, and the Law of Surfing, which characterizes the empirical distribution of the length of paths of visitors at a Web site.
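The navigation-choice component can be illustrated as a random utility model in which each link's utility is its information scent and choice follows a softmax; the scent values below are invented for the sketch.

```python
# Softmax navigation choice over link utilities (information scent).
# Scent values are invented illustration data.
import math

def choice_probabilities(scent, temperature=1.0):
    # Softmax over utilities; lower temperature -> more deterministic choice.
    weights = [math.exp(s / temperature) for s in scent]
    total = sum(weights)
    return [w / total for w in weights]

links = {"medical samples": 2.1, "site map": 0.3, "contact us": -0.5}
probs = choice_probabilities(list(links.values()))
for (label, scent), p in zip(links.items(), probs):
    print(f"{label}: scent={scent:+.1f} -> P(click)={p:.2f}")
```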
BioSWR – Semantic Web Services Registry for Bioinformatics
Repchevsky, Dmitry; Gelpi, Josep Ll.
2014-01-01
Despite the variety of available Web services registries specifically aimed at the Life Sciences, their scope is usually restricted to a limited set of well-defined types of services. While dedicated registries are generally tied to a particular format, general-purpose ones are more adherent to standards and usually rely on the Web Service Definition Language (WSDL). Although WSDL is flexible enough to support common Web service types, its lack of semantic expressiveness led to various initiatives to describe Web services via ontology languages. Nevertheless, WSDL 2.0 descriptions gained a standard representation based on the Web Ontology Language (OWL). BioSWR is a novel Web services registry that provides standard Resource Description Framework (RDF) based Web service descriptions along with the traditional WSDL based ones. The registry provides a Web-based interface for Web service registration, querying and annotation, and is also accessible programmatically via a Representational State Transfer (REST) API or using the SPARQL Protocol and RDF Query Language. The BioSWR server is located at http://inb.bsc.es/BioSWR/ and its code is available at https://sourceforge.net/projects/bioswr/ under the LGPL license. PMID:25233118
Russian Culture and Soviet Science.
1984-03-01
scientist the nature of the puzzles, but also provide him with a framework for their solutions by generating a web of expectations. The power of this...provide the scientist with both the puzzles and some expectation of the nature of their solutions. The web of expectations serves as a frame of...student to a paradigm method of problem solution, the scientific community ensures the continuation of a strong consensus. After admitting new
Securing a web-based teleradiology platform according to German law and "best practices".
Spitzer, Michael; Ullrich, Tobias; Ueckert, Frank
2009-01-01
The Medical Data and Picture Exchange platform (MDPE), as a teleradiology system, facilitates the exchange of digital medical imaging data among authorized users. It features extensive support of the DICOM standard, including networking functions. Since MDPE is designed as a web service, the security and confidentiality of data and communication pose an outstanding challenge. To comply with the demands of German laws and authorities, a generic data security concept considered "best practice" in German health telematics was adapted to the specific demands of MDPE. The concept features strict logical and physical separation of diagnostic and identity data, and thus an all-encompassing pseudonymization throughout the system; data may only be merged at authorized clients. MDPE's solution of merging data from separate sources within a web browser avoids technically questionable techniques such as deliberate cross-site scripting. Instead, data are merged dynamically by JavaScriptlets running in the user's browser; these scriptlets are provided by one server, while content and method calls are generated by another server. Additionally, MDPE uses encrypted temporary IDs for the communication and merging of data.
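The separation of identity and diagnostic data can be illustrated with a keyed pseudonym: a deterministic, non-reversible mapping from a patient identity to the identifier under which diagnostic data is stored. The sketch below is a simplification for illustration, not MDPE's actual scheme; in particular, real deployments need careful key management.

```python
# Keyed pseudonymization sketch: diagnostic records are stored under a
# pseudonym so identity and image data never sit together. Simplified;
# the key shown inline would in practice live only on the identity server.
import hmac
import hashlib

SECRET_KEY = b"held-by-the-identity-server-only"  # illustrative key

def pseudonym(patient_id: str) -> str:
    # Deterministic keyed hash: the same patient always maps to the same
    # pseudonym, but reversal requires the key holder.
    return hmac.new(SECRET_KEY, patient_id.encode(), hashlib.sha256).hexdigest()

diagnostic_record = {"pseudonym": pseudonym("patient-4711"),
                     "modality": "CT"}
print(diagnostic_record["pseudonym"][:16])
```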
NASA Astrophysics Data System (ADS)
Ferraris, M.; Risso, P.; Squarcia, S.
We present work on the organization, selection and transmission of radiotherapy data and images. We describe the choice of a standard healthcare record based on stereotactic and/or conformational radiotherapy, the implementation of the healthcare file in a distributed database using the World Wide Web platform for data presentation and transmission, and its availability on the network. The solution chosen is a good example of technology transfer between high-energy physics and medicine, and opens new and interesting possibilities in this field.
The Role of Computers in Research and Development at Langley Research Center
NASA Technical Reports Server (NTRS)
Wieseman, Carol D. (Compiler)
1994-01-01
This document is a compilation of presentations given at a workshop on the role of computers in research and development at the Langley Research Center. The objective of the workshop was to inform the Langley Research Center community of the current software systems and software practices in use at Langley. The workshop was organized in 10 sessions: Software Engineering; Software Engineering Standards, Methods, and CASE Tools; Solutions of Equations; Automatic Differentiation; Mosaic and the World Wide Web; Graphics and Image Processing; System Design Integration; CAE Tools; Languages; and Advanced Topics.
ERIC Educational Resources Information Center
Bordeianu, Sever; Carter, Christina E.; Dennis, Nancy K.
2000-01-01
Describes Web-based online public access catalogs (Web OPACs) and other Web-based tools as gateway methods for providing access to library collections. Addresses solutions for overcoming barriers to information, such as through the implementation of proxy servers and other authentication tools for remote users. (Contains 18 references.)…
XML — an opportunity for
NASA Astrophysics Data System (ADS)
Houlding, Simon W.
2001-08-01
Extensible markup language (XML) is a recently introduced meta-language standard on the Web. It provides the rules for the development of metadata (markup) standards for information transfer in specific fields. XML allows the development of markup languages that describe what information is, rather than how it should be presented, which allows computer applications to process the information in intelligent ways. In contrast, hypertext markup language (HTML), which fuelled the initial growth of the Web, is a metadata standard concerned exclusively with the presentation of information. Besides its potential for revolutionizing Web activities, XML provides an opportunity for the development of meaningful data standards in specific application fields. The rapid endorsement of XML by science, industry and e-commerce has already spawned new metadata standards in such fields as mathematics, chemistry, astronomy, multimedia and Web micro-payments. Development of XML-based data standards in the geosciences would significantly reduce the effort currently wasted on manipulating and reformatting data between different computer platforms and applications, and would ensure compatibility with the new generation of Web browsers. This paper explores the evolution, benefits and status of XML and related standards in the more general context of Web activities, and uses this as a platform for discussion of its potential for the development of data standards in the geosciences. Some of the advantages of XML are illustrated by a simple, browser-compatible demonstration of XML functionality applied to a borehole log dataset. The XML dataset and the associated stylesheet and schema declarations are available for FTP download.
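In the spirit of the paper's borehole-log demonstration, the sketch below marks up a log so that the tags describe what the data is, then processes it programmatically; the element names are invented for the sketch, not a published geoscience markup standard.

```python
# Toy borehole log marked up descriptively, then processed by a program.
# Element and attribute names are invented illustration.
import xml.etree.ElementTree as ET

doc = """
<boreholeLog id="BH-01">
  <interval top="0.0" base="3.5" unit="m"><lithology>clay</lithology></interval>
  <interval top="3.5" base="9.2" unit="m"><lithology>sand</lithology></interval>
</boreholeLog>
"""

root = ET.fromstring(doc)
for iv in root.findall("interval"):
    # Because the markup says what each value *is*, the application can
    # compute with it rather than merely display it.
    thickness = float(iv.get("base")) - float(iv.get("top"))
    print(iv.findtext("lithology"), f"{thickness:.1f} {iv.get('unit')}")
```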
The semantic web in translational medicine: current applications and future directions
Machado, Catia M.; Rebholz-Schuhmann, Dietrich; Freitas, Ana T.; Couto, Francisco M.
2015-01-01
Semantic web technologies offer an approach to data integration and sharing, even for resources developed independently or broadly distributed across the web. This approach is particularly suitable for scientific domains that profit from large amounts of data that reside in the public domain and that have to be exploited in combination. Translational medicine is such a domain, which in addition has to integrate private data from the clinical domain with proprietary data from the pharmaceutical domain. In this survey, we present the results of our analysis of translational medicine solutions that follow a semantic web approach. We assessed these solutions in terms of their target medical use case; the resources covered to achieve their objectives; and their use of existing semantic web resources for the purposes of data sharing, data interoperability and knowledge discovery. The semantic web technologies seem to fulfill their role in facilitating the integration and exploration of data from disparate sources, but it is also clear that simply using them is not enough. It is fundamental to reuse resources, to define mappings between resources, to share data and knowledge. All these aspects allow the instantiation of translational medicine at the semantic web-scale, thus resulting in a network of solutions that can share resources for a faster transfer of new scientific results into the clinical practice. The envisioned network of translational medicine solutions is on its way, but it still requires resolving the challenges of sharing protected data and of integrating semantic-driven technologies into the clinical practice. PMID:24197933
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-28
...-AP76 Oil and Natural Gas Sector: New Source Performance Standards and National Emission Standards for... and Natural Gas Sector: New Source Performance Standards and National Emission Standards for Hazardous... be charged for copying. World Wide Web. The EPA Web site for this rulemaking is located at: http...
Content and Workflow Management for Library Websites: Case Studies
ERIC Educational Resources Information Center
Yu, Holly, Ed.
2005-01-01
Using database-driven web pages or web content management (WCM) systems to manage increasingly diverse web content and to streamline workflows is a commonly practiced solution recognized in libraries today. However, limited library web content management models and funding constraints prevent many libraries from purchasing commercially available…
WEB-BASED MODELING OF A FERTILIZER SOLUTION SPILL IN THE OHIO RIVER
Environmental computer models are usually desktop models. Some web-enabled models are beginning to appear where the user can use a browser to run the models on a central web server. Several issues arise when a desktop model is transferred to a web architecture. This paper discuss...
Reviewing innovative Earth observation solutions for filling science-policy gaps in hydrology
NASA Astrophysics Data System (ADS)
Lehmann, Anthony; Giuliani, Gregory; Ray, Nicolas; Rahman, Kazi; Abbaspour, Karim C.; Nativi, Stefano; Craglia, Massimo; Cripe, Douglas; Quevauviller, Philippe; Beniston, Martin
2014-10-01
Improved data sharing is needed for hydrological modeling and water management, which require better integration of data, information and models. Technological advances in Earth observation and Web technologies have allowed the development of Spatial Data Infrastructures (SDIs) for improved data sharing at various scales. International initiatives catalyze data sharing by promoting interoperability standards to maximize the use of data and by supporting easy access to and utilization of geospatial data. A series of recent European projects are contributing to the promotion of innovative Earth observation solutions and the uptake of scientific outcomes in policy. Several success stories involving different hydrologists' communities can be reported around the world. Gaps still exist in hydrological, agricultural, meteorological and climatological data access for various reasons. While many sources of data exist at all scales, it remains difficult and time-consuming to assemble hydrological information for most projects, and data and sharing formats remain very heterogeneous. Improvements require implementing and endorsing commonly agreed standards and documenting data with adequate metadata. The brokering approach allows binding heterogeneous resources published by different data providers and adapting them to the tools and interfaces commonly used by consumers of these resources. The challenge is to provide decision-makers with reliable information, based on integrated data and tools derived from both Earth observations and scientific models. Successful SDIs therefore rely on various aspects: a shared vision between all participants, the necessity of solving a common problem, adequate data policies, incentives, and sufficient resources. New data streams from remote sensing or crowd sourcing are also producing valuable information to improve our understanding of the water cycle, while field sensors are developing rapidly and becoming less costly. More recent data standards are enhancing interoperability between hydrology and other scientific disciplines, while solutions exist to communicate the uncertainty of data and models, an essential prerequisite for decision-making. Distributed computing infrastructures can handle complex and large hydrological data and models, while Web Processing Services bring the flexibility to develop and execute simple to complex workflows over the Internet. The need for capacity building at the human, infrastructure and institutional levels is also a major driver for reinforcing the commitment to SDI concepts.
Sensor Web Interoperability Testbed Results Incorporating Earth Observation Satellites
NASA Technical Reports Server (NTRS)
Frye, Stuart; Mandl, Daniel J.; Alameh, Nadine; Bambacus, Myra; Cappelaere, Pat; Falke, Stefan; Derezinski, Linda; Zhao, Piesheng
2007-01-01
This paper describes an Earth Observation Sensor Web scenario based on the Open Geospatial Consortium's Sensor Web Enablement and Web Services interoperability standards. The scenario demonstrates the application of the standards in describing, discovering, accessing and tasking satellites and ground-based sensor installations in a sequence of analysis activities that deliver information required by decision makers in response to national, regional or local emergencies.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-12
... corresponding accessible pages on a mobile Web site by one year after the final rule's effective date; and (3... Mobile Web site conformant with any of the following standards: WCAG 1.0, WCAG 2.0 at Level A, existing Section 508 standards, or Mobile Web Best Practices (MWBP) 1.0 (if applicable). Two of the options they...
OneGeology Web Services and Portal as a global geological SDI - latest standards and technology
NASA Astrophysics Data System (ADS)
Duffy, Tim; Tellez-Arenas, Agnes
2014-05-01
The global coverage of OneGeology Web Services (www.onegeology.org and portal.onegeology.org) achieved since 2007 by the 120 participating geological surveys will be reviewed and the issues arising discussed. Recent enhancements to the OneGeology Web Services capabilities will be covered, including a new up-to-five-star service accreditation scheme utilising the ISO/OGC Web Map Service standard version 1.3, core ISO 19115 metadata additions, and version 2.0 Web Feature Services (WFS) serving the new IUGS-CGI GeoSciML V3.2 geological web data exchange language standard (http://www.geosciml.org/) with its associated 30+ IUGS-CGI vocabularies (http://resource.geosciml.org/ and http://srvgeosciml.brgm.fr/eXist2010/brgm/client.html). Use of the CGI simple lithology and timescale dictionaries now allows those who wish to do so to offer data harmonisation for queries of their GeoSciML 3.2 based Web Feature Services and their GeoSciML_Portrayal V2.0.1 (http://www.geosciml.org/) Web Map Services in the OneGeology portal (http://portal.onegeology.org). Contributing to OneGeology involves offering to serve, ideally, 1:1,000,000 scale geological data (in practice any scale is now warmly welcomed) as an OGC (Open Geospatial Consortium) standard based WMS (Web Mapping Service) from an available WWW server. This may be hosted either within the geological survey or at a neighbouring, regional or other institution that offers to serve the data on its behalf, i.e. offers to help technically by providing the web-serving IT infrastructure as a 'buddy'. OneGeology is a standards-focussed Spatial Data Infrastructure (SDI) and works to ensure that these standards work together, and it is now possible for European geological surveys to register their INSPIRE web services within the OneGeology SDI (see e.g. http://www.geosciml.org/geosciml/3.2/documentation/cookbook/INSPIRE_GeoSciML_Cookbook%20_1.0.pdf). The OneGeology portal (http://portal.onegeology.org) is the first port of call for anyone wishing to discover the availability of global geological web services, and has new functionality to view and use such services, including multiple projection support. KEYWORDS: OneGeology; GeoSciML V 3.2; Data exchange; Portal; INSPIRE; Standards; OGC; Interoperability; GeoScience information; WMS; WFS; Cookbook.
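A OneGeology-style map request is a standard WMS 1.3.0 GetMap call, sketched below; the endpoint and layer name are hypothetical.

```python
# WMS 1.3.0 GetMap request of the kind the portal issues. Endpoint and
# layer name are hypothetical placeholders.
import requests

params = {
    "service": "WMS", "version": "1.3.0", "request": "GetMap",
    "layers": "GBR_BGS_1M_Bedrock",   # hypothetical layer name
    "styles": "",
    "crs": "EPSG:4326",
    "bbox": "49.0,-8.0,61.0,2.0",     # WMS 1.3 axis order for EPSG:4326
    "width": "800", "height": "600",
    "format": "image/png",
}
resp = requests.get("http://example.org/onegeology/wms",
                    params=params, timeout=60)
resp.raise_for_status()
with open("bedrock.png", "wb") as f:
    f.write(resp.content)
```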
Intercomparison of different operational oceanographic forecast products in the CMEMS IBI area
NASA Astrophysics Data System (ADS)
Lorente, Pablo; Sotillo, Marcos G.; Dabrowski, Tomasz; Amo-Baladrón, Arancha; Aznar, Roland; De Pascual, Alvaro; Levier, Bruno; Bowyer, Peter; Cossarini, Gianpiero; Salon, Stefano; Tonani, Marina; Alvarez-Fanjul, Enrique
2017-04-01
The development of skill assessment software packages and dedicated web applications is a relatively novel theme in operational oceanography. Within the CMEMS IBI-MFC, the quality of IBI (Iberia-Biscay-Ireland) forecast products is assessed by means of the NARVAL (North Atlantic Regional VALidation) web-based tool. Validation of IBI against independent in situ and remote-sensing measurements is routinely conducted to evaluate the model's veracity and prognostic capabilities, and considerable efforts are in progress to define meaningful skill scores and statistical metrics to quantitatively assess the quality and reliability of the IBI model solution. Likewise, the IBI-MFC compares the IBI forecast products with other model solutions by setting up specific intercomparison exercises on overlapping areas at diverse timescales. In this context, the NARVAL web tool already includes a specific module to evaluate the strengths and weaknesses of IBI versus other CMEMS operational ocean forecasting systems (OOFSs). In particular, the IBI physical ocean solution is compared against the CMEMS MED and NWS OOFSs; these CMEMS regional services, delivered for the Mediterranean and the North West Shelves, include data assimilation schemes in their respective operational chains and generate ocean forecast products analogous to the IBI ones. A number of physical parameters (i.e. sea surface temperature, salinity and current velocities) are evaluated through NARVAL on a daily basis in the areas where these three regional systems overlap. NARVAL is currently being updated in order to extend this intercomparison of ocean model parameters to the biogeochemical solutions provided by the aforementioned OOFSs; more specifically, the simulated chlorophyll concentration is evaluated over several subregions of particular concern, using the CMEMS satellite-derived observational products as a benchmark. In addition to this comparison of IBI against other regional CMEMS products on overlapping areas, a specific intercomparison between the CMEMS GLOBAL solution and IBI (a regional application dynamically embedded in the former) is conducted in order to check its consistency and its ability to outperform the parent model solution. Particular emphasis is placed on the comparison of time series at specified locations (class-2 metrics). The standardized validation methodology presented here is particularly useful and could encompass the intercomparison of the regional application (IBI) with other nested higher-resolution models at coastal/shelf scales to quantify the added value of downscaling in local downstream approaches.
Turning Interoperability Operational with GST
NASA Astrophysics Data System (ADS)
Schaeben, Helmut; Gabriel, Paul; Gietzel, Jan; Le, Hai Ha
2013-04-01
GST - Geosciences in Space and Time - is being developed and implemented as a hub to facilitate the exchange of spatially and temporally indexed multi-dimensional geoscience data and corresponding geomodels amongst partners. It originates from TUBAF's contribution to the EU project "ProMine", and its prospective extensions are TUBAF's contribution to the current EU project "GeoMol". As of today, it provides the basic components of a geodata infrastructure as required to establish interoperability with respect to geosciences. Generally, interoperability means the facilitation of cross-border and cross-sector information exchange, taking into account legal, organisational, semantic and technical aspects, cf. Interoperability Solutions for European Public Administrations (ISA), http://ec.europa.eu/isa/. Practical interoperability for partners of a joint geoscience project, say European geological surveys acting in a border region, means in particular the provision of IT technology to exchange spatially, and possibly also temporally, indexed multi-dimensional geoscience data and corresponding models, i.e. the objects composing geomodels capturing the geometry, topology and various geoscience contents. Geodata Infrastructure (GDI) and interoperability are objectives of several initiatives, e.g. INSPIRE, OneGeology-Europe and, most recently, EGDI-SCOPE, to name just the most prominent ones. Then there are quite a few markup languages (ML) related to geographical or geological information, like GeoSciML, EarthResourceML, BoreholeML, and ResqML for reservoir characterization, earth and reservoir models, among many others featuring geoscience information. Several web services are focused on geographical or geoscience information. The Open Geospatial Consortium (OGC) promotes specifications of a Web Feature Service (WFS), a Web Map Service (WMS), a Web Coverage Service (WCS), a Web 3D Service (W3DS), and many more. It will be clarified how GST is related to these initiatives, especially how it complies with existing or developing standards or quasi-standards, and how it applies and extends services towards interoperability in the Earth sciences.
40 CFR 63.3280 - What is in this subpart?
Code of Federal Regulations, 2012 CFR
2012-07-01
...) National Emission Standards for Hazardous Air Pollutants: Paper and Other Web Coating What This Subpart... emissions of organic hazardous air pollutants (HAP) from paper and other web coating operations. This subpart establishes emission standards for web coating lines and specifies what you must do to comply if...
40 CFR 63.3280 - What is in this subpart?
Code of Federal Regulations, 2014 CFR
2014-07-01
...) National Emission Standards for Hazardous Air Pollutants: Paper and Other Web Coating What This Subpart... emissions of organic hazardous air pollutants (HAP) from paper and other web coating operations. This subpart establishes emission standards for web coating lines and specifies what you must do to comply if...
40 CFR 63.3280 - What is in this subpart?
Code of Federal Regulations, 2013 CFR
2013-07-01
...) National Emission Standards for Hazardous Air Pollutants: Paper and Other Web Coating What This Subpart... emissions of organic hazardous air pollutants (HAP) from paper and other web coating operations. This subpart establishes emission standards for web coating lines and specifies what you must do to comply if...
Acute microcirculatory response to nicotine in frog web.
Horimoto, M; Koyama, T
1982-01-01
Acute effects of nicotine (NC) on the microcirculation of frog webs were studied by measuring the blood flow velocity in arterioles and by determining the diameter of both arterioles and venules. Simultaneous recordings of ventricular pressure and heart rate were obtained in order to compute the vascular resistance and to interpret the changes in microcirculation. The web of the right hindlimb was immersed in a solution of NC (2.6 to 3.4 mg/ml) for 4 min. Blood flow velocity in the web arterioles of the left hindlimb was measured by means of a laser Doppler microscope. Internal diameters of web microvessels were determined using a micrometer on the ocular lens of the microscope. Mean flow velocity (MV) and pulsatile amplitude (PA) were calculated from the pulsatile flow-velocity contour for each vessel. Both MV and PA increased after immersion of the web in the NC solution. Although the magnitude of the increase in MV was proportional to that in ventricular pressure, the vasodilation of both arterioles and venules, and an arteriolar flow rate above the initial level, persisted even after the ventricular pressure had returned to its initial control value. Calculation of the relative change in vascular resistance in web arterioles following NC administration suggested a vasodilator response to NC. Furthermore, our results indicate that sufficient NC can be absorbed across the web epithelium to produce a systemic vascular response when the concentration of NC in the bathing solution is 2.6 mg/ml.
ERIC Educational Resources Information Center
Obilade, Titilola T.; Burton, John K.
2015-01-01
This textual content analysis set out to determine the extent to which the theories, principles, and guidelines in 4 standard books of instructional design and technology were also addressed in 4 popular books on web design. The standard books on instructional design and the popular books on web design were chosen by experts in the fields. The…
Multilingual Medical Data Models in ODM Format
Breil, B.; Kenneweg, J.; Fritz, F.; Bruland, P.; Doods, D.; Trinczek, B.; Dugas, M.
2012-01-01
Background: Semantic interoperability between routine healthcare and clinical research is an unsolved issue, as information systems in the healthcare domain still use proprietary and site-specific data models. However, information exchange and data harmonization are essential for physicians and scientists if they want to collect and analyze data from different hospitals in order to build up registries and perform multicenter clinical trials. Consequently, there is a need for a standardized metadata exchange based on common data models. Currently this is mainly done by informatics experts instead of medical experts. Objectives: We propose to enable physicians to exchange, rate, comment and discuss their own medical data models in a collaborative web-based repository of medical forms in a standardized format. Methods: Based on a comprehensive requirement analysis, a web-based portal for medical data models was specified. In this context, a data model is the technical specification (attributes, data types, value lists) of a medical form without any layout information. The CDISC Operational Data Model (ODM) was chosen as the appropriate format for the standardized representation of data models. The system was implemented with Ruby on Rails and applies Web 2.0 technologies to provide a community-based solution. Forms from different source systems - both routine care and clinical research - were converted into ODM format and uploaded into the portal. Results: A portal for medical data models based on ODM files was implemented (http://www.medical-data-models.org). Physicians are able to upload, comment, rate and download medical data models. More than 250 forms with approximately 8000 items are provided in different views (overview and detailed presentation) and in multiple languages. For instance, the portal contains forms from clinical and research information systems. Conclusion: The portal provides a system-independent repository for multilingual data models in ODM format which can be used by physicians. It serves as a platform for discussion and enables the exchange of multilingual medical data models in a standardized way. PMID:23620720
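To make the format concrete, the following minimal sketch generates a reduced ODM metadata skeleton for a single form with multilingual labels, using only the Python standard library; the OIDs and names are hypothetical, and several mandatory ODM attributes are omitted for brevity, so this is illustrative rather than schema-valid.

```python
import xml.etree.ElementTree as ET

# Minimal (non-validating) ODM metadata skeleton for one form.
odm = ET.Element("ODM", FileOID="F.2012.001", FileType="Snapshot")
study = ET.SubElement(odm, "Study", OID="S.DEMO")
mdv = ET.SubElement(study, "MetaDataVersion", OID="MDV.1", Name="Demo version")

form = ET.SubElement(mdv, "FormDef", OID="F.VITALS", Name="Vital signs", Repeating="No")
ET.SubElement(form, "ItemGroupRef", ItemGroupOID="IG.VITALS", Mandatory="Yes")

ig = ET.SubElement(mdv, "ItemGroupDef", OID="IG.VITALS", Name="Vital signs", Repeating="No")
ET.SubElement(ig, "ItemRef", ItemOID="I.HR", Mandatory="Yes")

item = ET.SubElement(mdv, "ItemDef", OID="I.HR", Name="Heart rate", DataType="integer")
question = ET.SubElement(item, "Question")
# Multilingual labels, as offered by the portal, are plain TranslatedText elements.
for lang, text in [("en", "Heart rate (bpm)"), ("de", "Herzfrequenz (bpm)")]:
    t = ET.SubElement(question, "TranslatedText")
    t.set("xml:lang", lang)
    t.text = text

print(ET.tostring(odm, encoding="unicode"))
```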
An approach for the semantic interoperability of ISO EN 13606 and OpenEHR archetypes.
Martínez-Costa, Catalina; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás
2010-10-01
The communication between health information systems of hospitals and primary care organizations is currently an important challenge to improve the quality of clinical practice and patient safety. However, clinical information is usually distributed among several independent systems that may be syntactically or semantically incompatible. This fact prevents healthcare professionals from accessing clinical information of patients in an understandable and normalized way. In this work, we address the semantic interoperability of two EHR standards: OpenEHR and ISO EN 13606. Both standards follow the dual model approach which distinguishes information and knowledge, this being represented through archetypes. The solution presented here is capable of transforming OpenEHR archetypes into ISO EN 13606 and vice versa by combining Semantic Web and Model-driven Engineering technologies. The resulting software implementation has been tested using publicly available collections of archetypes for both standards.
Chen, Jengchung Victor; Ross, William H; Yen, David C; Akhapon, Lerdsuwankij
2009-02-01
In this study, three characteristics of Web sites were varied: types of banner ad, Web localization, and involvement in purchasing a product. The dependent variable was attitude toward the site. In laboratory experiments conducted in Thailand and Taiwan, participants browsed versions of a Web site containing different types of banner ads and products. As a within-participants factor, each participant browsed both a standardized English-language Web site and a localized Web site. Results showed that animated (rather than static) banner ads, localized versions (rather than a standardized version) of Web sites, and high (rather than low) product involvement led to favorable attitudes toward the site.
EO Domain Specific Knowledge Enabled Services (KES-B)
NASA Astrophysics Data System (ADS)
Varas, J.; Busto, J.; Torguet, R.
2004-09-01
This paper recovers and describes a number of major statements with respect to the vision, mission and technological approaches of the Technological Research Project (TRP) "EO Domain Specific Knowledge Enabled Services" (project acronym KES-B), which is currently under development at the European Space Research Institute (ESRIN) under contract "16397/02/I-SB". Building on the ongoing R&D activities, the KES-B project aims to demonstrate with a prototype system the feasibility of applying innovative knowledge-based technologies to provide services for easy, scheduled and controlled exploitation of EO resources (e.g. data, algorithms, procedures, storage, processors, ...), to automate the generation of products, and to support users in easily identifying and accessing the required information or products by using their own vocabulary, domain knowledge and preferences. The ultimate goals of KES-B are summarized in the provision of the two main types of KES services: first, the Search service (also referred to as Product Exploitation or Information Retrieval); and second, the Production service (also referred to as Information Extraction), with the strategic advantage that they are enabled by knowledge consolidated (formalized) within the system. The KES-B system technical solution approach is driven by a strong commitment to the adoption of industry (XML-based) language standards, aiming at an interoperable, scalable and flexible operational prototype. In that sense, the Search KES services build on the adoption of consolidated and/or emergent W3C semantic-web standards. Notably, the languages/models Dublin Core (DC), Universal Resource Identifier (URI), Resource Description Framework (RDF) and Ontology Web Language (OWL), and COTS like Protege [1] and JENA [2], are being integrated in the system as building blocks for the construction of the KES-based Search services. On the other hand, the Production KES services build on top of workflow management standards and tools. On this side, the Business Process Execution Language (BPEL), the Web Services Definition Language (WSDL), and the Collaxa [3] COTS tool for workflow management are being integrated for the construction of the KES-B Production Services. The KES-B platform (web portal and web server) architecture is built on the basis of the J2EE reference architecture. These languages represent the means for codifying the different types of knowledge that are to be formalized in the system; this constitutes the ontological architecture of the system. It shall in fact enable interoperability with other KES-based systems that also commit to those standards. The motivation behind this vision points towards the construction of a Semantic-web based GRID supply-chain infrastructure for EO services, in line with the INSPIRE initiative suggestions.
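As a small illustration of the Search-side building blocks, the sketch below uses the rdflib library to describe a hypothetical EO product with Dublin Core terms in RDF; the namespace, resource URIs and property values are invented, and this is not the project's actual data model.

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DC, RDF

EO = Namespace("http://example.org/eo#")            # hypothetical EO vocabulary
product = URIRef("http://example.org/products/scene-42")

g = Graph()
g.bind("dc", DC)
g.bind("eo", EO)

g.add((product, RDF.type, EO.Scene))
g.add((product, DC.title, Literal("SAR scene over the Gulf of Biscay")))
g.add((product, DC.date, Literal("2004-06-01")))
g.add((product, DC.format, Literal("CEOS")))
g.add((product, EO.sensor, Literal("ASAR")))

# Serialize the description as Turtle, ready for ingestion by an RDF store.
print(g.serialize(format="turtle"))
```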
One Method for Inhibiting the Copying of Online Homework
NASA Astrophysics Data System (ADS)
Busch, Hauke
2017-10-01
Over the last several years online homework solutions have become ever more accessible to students. This is due in part to programs like Yahoo Answers, Chegg, publisher solution manuals, and other web resources that are readily available online. A student can search any physics homework problem posted on the web in a matter of seconds and have the solution. The result is an apparent increase in students copying answers without solving the problems, which may lead to an increase in homework scores but a reduction in exam scores and an overall lower grade in the class. A secondary effect that may be observed is reduced student attendance at tutoring centers, recitations, and supplemental instruction sessions. Some might say that the readily available solutions for homework systems such as MasteringPhysics (MP), WebAssign, etc. have greatly diminished their value as a teaching tool and as a means of grading and assessing students' performance in a course. The purpose of this paper is to offer a possible solution for preventing students from copying online homework solutions.
ERIC Educational Resources Information Center
Dabbagh, Nada; Denisar, Katrina
2005-01-01
For this study, we examined the cogency, comprehensiveness, and viability of team-based problem solutions of a Web-based hypermedia case designed to promote student understanding of the practice of instructional design. Participants were 14 students enrolled in a graduate course on advanced instructional design. The case was presented to students…
One Method for Inhibiting the Copying of Online Homework
ERIC Educational Resources Information Center
Busch, Hauke
2017-01-01
Over the last several years online homework solutions have become ever more accessible to students. This is due in part to programs like Yahoo Answers, Chegg, publisher solution manuals, and other web resources that are readily available online. The student can easily search any physics homework problem posted on the web in a matter of seconds and…
Mobile Monitoring Stations and Web Visualization of Biotelemetric System - Guardian II
NASA Astrophysics Data System (ADS)
Krejcar, Ondrej; Janckulik, Dalibor; Motalova, Leona; Kufel, Jan
The main area of interest of our project is to provide a solution which can be used in different areas of health care and which will be available through PDAs (Personal Digital Assistants), web browsers or desktop clients. The realized system deals with an ECG sensor connected to mobile equipment, such as a PDA or embedded device, based on the Microsoft Windows Mobile operating system. The whole system is based on the architecture of the .NET Compact Framework and Microsoft SQL Server. Visualization possibilities of the web interface and ECG data are also discussed, and a final recommendation is made for a Microsoft Silverlight solution, along with current screenshots of the implemented solution. The project was successfully tested in a real environment, in a cryogenic room (-136 °C).
SEnviro: a sensorized platform proposal using open hardware and open standards.
Trilles, Sergio; Luján, Alejandro; Belmonte, Óscar; Montoliu, Raúl; Torres-Sospedra, Joaquín; Huerta, Joaquín
2015-03-06
The need for constant monitoring of environmental conditions has produced an increase in the development of wireless sensor networks (WSN). The drive towards smart cities has produced the need for smart sensors to be able to monitor what is happening in our cities. This, combined with the decrease in hardware component prices and the increase in the popularity of open hardware, has favored the deployment of sensor networks based on open hardware. The new trends in Internet Protocol (IP) communication between sensor nodes allow sensor access via the Internet, turning them into smart objects (Internet of Things and Web of Things). Currently, WSNs provide data in different formats. There is a lack of communication protocol standardization, which turns into interoperability issues when connecting different sensor networks or even when connecting different sensor nodes within the same network. This work presents a sensorized platform proposal that adheres to the principles of the Internet of Things and the Web of Things. Wireless sensor nodes were built using open hardware solutions, and communications rely on the HTTP/IP Internet protocols. The Open Geospatial Consortium (OGC) SensorThings API candidate standard was used as a neutral format to avoid interoperability issues. An environmental WSN developed following the proposed architecture was built as a proof of concept. Details on how to build each node and a study regarding energy concerns are presented.
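A hedged sketch of how a client might pull recent observations from such a SensorThings API endpoint is shown below; the service root is a placeholder, while the path and query options are standard SensorThings/OData parameters.

```python
import requests

# Hypothetical SensorThings API root of a SEnviro-style deployment.
STA_ROOT = "https://example.org/senviro/v1.0"

# Latest 5 observations of datastream 1, newest first
# (standard SensorThings/OData query options).
url = f"{STA_ROOT}/Datastreams(1)/Observations"
params = {"$top": 5, "$orderby": "phenomenonTime desc"}

resp = requests.get(url, params=params, timeout=10)
resp.raise_for_status()
for obs in resp.json()["value"]:
    print(obs["phenomenonTime"], obs["result"])
```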
SEnviro: A Sensorized Platform Proposal Using Open Hardware and Open Standards
Trilles, Sergio; Luján, Alejandro; Belmonte, Óscar; Montoliu, Raúl; Torres-Sospedra, Joaquín; Huerta, Joaquín
2015-01-01
The need for constant monitoring of environmental conditions has produced an increase in the development of wireless sensor networks (WSN). The drive towards smart cities has produced the need for smart sensors to be able to monitor what is happening in our cities. This, combined with the decrease in hardware component prices and the increase in the popularity of open hardware, has favored the deployment of sensor networks based on open hardware. The new trends in Internet Protocol (IP) communication between sensor nodes allow sensor access via the Internet, turning them into smart objects (Internet of Things and Web of Things). Currently, WSNs provide data in different formats. There is a lack of communication protocol standardization, which turns into interoperability issues when connecting different sensor networks or even when connecting different sensor nodes within the same network. This work presents a sensorized platform proposal that adheres to the principles of the Internet of Things and the Web of Things. Wireless sensor nodes were built using open hardware solutions, and communications rely on the HTTP/IP Internet protocols. The Open Geospatial Consortium (OGC) SensorThings API candidate standard was used as a neutral format to avoid interoperability issues. An environmental WSN developed following the proposed architecture was built as a proof of concept. Details on how to build each node and a study regarding energy concerns are presented. PMID:25756864
40 CFR 63.3280 - What is in this subpart?
Code of Federal Regulations, 2010 CFR
2010-07-01
... Standards for Hazardous Air Pollutants: Paper and Other Web Coating What This Subpart Covers § 63.3280 What... hazardous air pollutants (HAP) from paper and other web coating operations. This subpart establishes emission standards for web coating lines and specifies what you must do to comply if you own or operate a...
40 CFR 63.825 - Standards: Product and packaging rotogravure and wide-web flexographic printing.
Code of Federal Regulations, 2011 CFR
2011-07-01
... within ±2.0 percent. (vi) Measure the amount of volatile matter recovered for the month. (vii) Calculate... rotogravure and wide-web flexographic printing. 63.825 Section 63.825 Protection of Environment ENVIRONMENTAL... Industry § 63.825 Standards: Product and packaging rotogravure and wide-web flexographic printing. (a) Each...
40 CFR 63.825 - Standards: Product and packaging rotogravure and wide-web flexographic printing.
Code of Federal Regulations, 2010 CFR
2010-07-01
... within ±2.0 percent. (vi) Measure the amount of volatile matter recovered for the month. (vii) Calculate... rotogravure and wide-web flexographic printing. 63.825 Section 63.825 Protection of Environment ENVIRONMENTAL... Industry § 63.825 Standards: Product and packaging rotogravure and wide-web flexographic printing. (a) Each...
40 CFR 63.3280 - What is in this subpart?
Code of Federal Regulations, 2011 CFR
2011-07-01
... Standards for Hazardous Air Pollutants: Paper and Other Web Coating What This Subpart Covers § 63.3280 What... hazardous air pollutants (HAP) from paper and other web coating operations. This subpart establishes emission standards for web coating lines and specifies what you must do to comply if you own or operate a...
40 CFR 63.3321 - What operating limits must I meet?
Code of Federal Regulations, 2012 CFR
2012-07-01
...) National Emission Standards for Hazardous Air Pollutants: Paper and Other Web Coating Emission Standards and Compliance Dates § 63.3321 What operating limits must I meet? (a) For any web coating line or group of web coating lines for which you use add-on control devices, unless you use a solvent recovery...
40 CFR 63.3321 - What operating limits must I meet?
Code of Federal Regulations, 2013 CFR
2013-07-01
...) National Emission Standards for Hazardous Air Pollutants: Paper and Other Web Coating Emission Standards and Compliance Dates § 63.3321 What operating limits must I meet? (a) For any web coating line or group of web coating lines for which you use add-on control devices, unless you use a solvent recovery...
40 CFR 63.3321 - What operating limits must I meet?
Code of Federal Regulations, 2014 CFR
2014-07-01
...) National Emission Standards for Hazardous Air Pollutants: Paper and Other Web Coating Emission Standards and Compliance Dates § 63.3321 What operating limits must I meet? (a) For any web coating line or group of web coating lines for which you use add-on control devices, unless you use a solvent recovery...
A web-based rapid assessment tool for production publishing solutions
NASA Astrophysics Data System (ADS)
Sun, Tong
2010-02-01
Solution assessment is a critical first step in understanding and measuring the business process efficiency enabled by an integrated solution package. However, assessing the effectiveness of any solution is usually a very expensive and time-consuming task which requires substantial domain knowledge, collecting and understanding the specific customer operational context, defining validation scenarios, and estimating the expected performance and operational cost. This paper presents an intelligent web-based tool that can rapidly assess any given solution package for production publishing workflows via a simulation engine and create a report of various estimated performance metrics (e.g. throughput, turnaround time, resource utilization) and operational cost. By integrating a digital publishing workflow ontology and an activity-based costing model with a Petri-net based workflow simulation engine, this web-based tool allows users to quickly evaluate potential digital publishing solutions side by side within their desired operational contexts, and provides organizations with a low-cost and rapid assessment before they commit to any purchase. The tool also benefits solution providers by shortening sales cycles, establishing trustworthy customer relationships, and supplementing professional assessment services with a proven quantitative simulation and estimation technology.
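As a toy illustration of the kind of estimates such an engine produces, the sketch below computes bottleneck-limited throughput, unloaded turnaround time and an activity-based cost for a serial print workflow; the station times and rates are invented, and this deterministic calculation merely stands in for (and greatly simplifies) the Petri-net simulation described in the paper.

```python
# Toy estimate of throughput/turnaround for a serial publishing workflow.
activities = {            # minutes per job at each station (hypothetical)
    "prepress": 4.0,
    "print":    2.5,
    "finish":   3.0,
}
cost_per_min = {"prepress": 0.8, "print": 1.5, "finish": 0.6}  # $/min (hypothetical)

bottleneck = max(activities, key=activities.get)
throughput_per_hr = 60.0 / activities[bottleneck]   # jobs/hour, limited by slowest station
turnaround_min = sum(activities.values())           # unloaded flow time per job
cost_per_job = sum(activities[a] * cost_per_min[a] for a in activities)

print(f"bottleneck: {bottleneck}")
print(f"throughput: {throughput_per_hr:.1f} jobs/h")
print(f"turnaround: {turnaround_min:.1f} min")
print(f"activity-based cost: ${cost_per_job:.2f}/job")
```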
Comparison of Web-Based and Face-to-Face Standard Setting Using the Angoff Method
ERIC Educational Resources Information Center
Katz, Irvin R.; Tannenbaum, Richard J.
2014-01-01
Web-based standard setting holds promise for reducing the travel and logistical inconveniences of traditional, face-to-face standard setting meetings. However, because there are few published reports of setting standards via remote meeting technology, little is known about the practical potential of the approach, including technical feasibility of…
Web Content Management and One EPA Web Factsheet
One EPA Web is a multi-year project to improve EPA’s website to better meet the needs of our Web visitors. Content is developed and managed in the WebCMS which supports One EPA Web goals by standardizing how we create and publish content.
ERIC Educational Resources Information Center
Dodge, Lucy
The report describes San Jose College's (California) two Web site management and design programs, and provides employment information and job market analysis for the field. The College's Web Site Administration and Web Application Solutions programs offer classes designed to give students the necessary skills in administering a Web site and in…
PaaS for web applications with OpenShift Origin
NASA Astrophysics Data System (ADS)
Lossent, A.; Rodriguez Peon, A.; Wagner, A.
2017-10-01
The CERN Web Frameworks team has deployed OpenShift Origin to facilitate the deployment of web applications and to improve efficiency in terms of computing resource usage. OpenShift leverages Docker containers and Kubernetes orchestration to provide a Platform-as-a-Service solution oriented toward web applications. We will review use cases and how OpenShift was integrated with other services such as source control, web site management and authentication services.
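For flavour, the sketch below creates a two-replica web application Deployment with the official Kubernetes Python client, the orchestration layer that OpenShift builds on; the image, names and namespace are hypothetical, a valid kubeconfig is assumed, and this is not CERN's actual setup.

```python
from kubernetes import client, config

config.load_kube_config()  # assumes a kubeconfig for the target cluster

container = client.V1Container(
    name="webapp",
    image="registry.example.org/webapp:1.0",   # hypothetical image
    ports=[client.V1ContainerPort(container_port=8080)],
)
template = client.V1PodTemplateSpec(
    metadata=client.V1ObjectMeta(labels={"app": "webapp"}),
    spec=client.V1PodSpec(containers=[container]),
)
spec = client.V1DeploymentSpec(
    replicas=2,
    selector=client.V1LabelSelector(match_labels={"app": "webapp"}),
    template=template,
)
deployment = client.V1Deployment(
    api_version="apps/v1",
    kind="Deployment",
    metadata=client.V1ObjectMeta(name="webapp"),
    spec=spec,
)
client.AppsV1Api().create_namespaced_deployment(namespace="web", body=deployment)
```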
Web Camera Based Eye Tracking to Assess Visual Memory on a Visual Paired Comparison Task.
Bott, Nicholas T; Lange, Alex; Rentz, Dorene; Buffalo, Elizabeth; Clopton, Paul; Zola, Stuart
2017-01-01
Background: Web cameras are increasingly part of the standard hardware of most smart devices. Eye movements can often provide a noninvasive "window on the brain," and the recording of eye movements using web cameras is a burgeoning area of research. Objective: This study investigated a novel methodology for administering a visual paired comparison (VPC) decisional task using a web camera. To further assess this method, we examined the correlation between a standard eye-tracking camera automated scoring procedure [obtaining images at 60 frames per second (FPS)] and a manually scored procedure using a built-in laptop web camera (obtaining images at 3 FPS). Methods: This was an observational study of 54 clinically normal older adults. Subjects completed three in-clinic visits with simultaneous recording of eye movements on a VPC decision task by a standard eye tracker camera and a built-in laptop-based web camera. Inter-rater reliability was analyzed using Siegel and Castellan's kappa formula. Pearson correlations were used to investigate the correlation between VPC performance using a standard eye tracker camera and a built-in web camera. Results: Strong associations were observed on VPC mean novelty preference score between the 60 FPS eye tracker and the 3 FPS built-in web camera at each of the three visits (r = 0.88-0.92). Inter-rater agreement of web camera scoring at each time point was high (κ = 0.81-0.88). There were strong relationships on VPC mean novelty preference score between the 10, 5, and 3 FPS training sets (r = 0.88-0.94). Significantly fewer data quality issues were encountered using the built-in web camera. Conclusions: Human scoring of a VPC decisional task using a built-in laptop web camera correlated strongly with automated scoring of the same task using a standard high frame rate eye tracker camera. While this method is not suitable for eye tracking paradigms requiring the collection and analysis of fine-grained metrics, such as fixation points, built-in web cameras are a standard feature of most smart devices (e.g., laptops, tablets, smart phones) and can be effectively employed to track eye movements on decisional tasks with high accuracy and minimal cost.
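A hedged sketch of the two reported statistics is shown below, using Pearson's r for the cross-method correlation and Cohen's kappa (as a common stand-in for the Siegel and Castellan formula named above) for inter-rater agreement; all data values are invented.

```python
import numpy as np
from scipy.stats import pearsonr
from sklearn.metrics import cohen_kappa_score

# Hypothetical novelty-preference scores from the two methods (one value per subject).
eye_tracker_60fps = np.array([0.61, 0.55, 0.70, 0.64, 0.58, 0.66])
web_camera_3fps   = np.array([0.59, 0.57, 0.68, 0.66, 0.56, 0.64])

r, p = pearsonr(eye_tracker_60fps, web_camera_3fps)
print(f"Pearson r = {r:.2f} (p = {p:.3f})")

# Hypothetical categorical ratings (e.g., left/right look) from two human scorers.
rater_a = [0, 1, 1, 0, 1, 0, 1, 1, 0, 0]
rater_b = [0, 1, 1, 0, 1, 1, 1, 1, 0, 0]
print(f"kappa = {cohen_kappa_score(rater_a, rater_b):.2f}")
```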
Otero, P; Hersh, W
2011-01-01
Web 3.0 is transforming the World Wide Web by allowing knowledge and reasoning to be gleaned from its content. We describe a new scenario in education and training known as "Education 3.0" that can help promote learning in health informatics in a collaborative way, and review the current standards available for curricula and learning activities in Biomedical and Health Informatics (BMHI) for a Web 3.0 scenario. "Education 3.0" can provide open educational resources created and reused across different institutions and improved by means of international collaborative knowledge-sharing powered by e-learning. Currently there are standards that could be used to identify and deliver educational content in BMHI in the semantic web era, such as the Resource Description Framework (RDF), the Web Ontology Language (OWL) and the Sharable Content Object Reference Model (SCORM). In addition, there are other standards to support healthcare education and training. Few experiences with the use of standards in e-learning in BMHI have been published in the literature. Web 3.0 can propose new approaches to building the BMHI workforce, so there is a need to build tools as knowledge infrastructure to leverage it. The usefulness of standards for the content and competencies of training programs in BMHI needs more experience and research, so as to promote the interoperability and sharing of resources in this growing discipline.
Trigo, Jesús Daniel; Martínez, Ignacio; Alesanco, Alvaro; Kollmann, Alexander; Escayola, Javier; Hayn, Dieter; Schreier, Günter; García, José
2012-07-01
This paper investigates the application of the enterprise information system (EIS) paradigm to standardized cardiovascular condition monitoring. There are many specifications in cardiology, particularly in the ECG standardization arena. The existence of ECG formats, however, does not guarantee the implementation of homogeneous, standardized solutions for ECG management. In fact, hospital management services need to cope with various ECG formats and, moreover, several different visualization applications. This heterogeneity hampers the normalization of integrated, standardized healthcare information systems, hence the need to find an appropriate combination of ECG formats and a suitable EIS-based software architecture that enables the standardized exchange and homogeneous management of ECG formats. Determining such a combination is one objective of this paper. The second aim is to design and develop the integrated healthcare information system that satisfies the requirements posed by that determination. The ECG formats selected include ISO/IEEE 11073, the Standard Communications Protocol for Computer-Assisted Electrocardiography, and an ECG ontology. The EIS-enabling techniques and technologies selected include web services, the Simple Object Access Protocol (SOAP), the Extensible Markup Language (XML) and the Business Process Execution Language (BPEL). Such a selection ensures the standardized exchange of ECGs within, or across, healthcare information systems while providing modularity and accessibility.
Additives influence on spinning solution and nano web properties
NASA Astrophysics Data System (ADS)
Kukle, S.; Jegina, S.; Sutka, A.; Makovska, R.
2017-10-01
Needleless electrospinning, operated as a one-stage process that produces nanofibre webs from spinning solutions with properties corresponding to the final use, appears to have good future prospects. Designing a complicated spinning solution starts with selecting the composition and the proportions of the components, followed by establishing the pre-processing sequence and parameters for every component and for their mixing. Spinning solution viscosity and electrical conductivity, together with the spinning distance and the intensity of the electromagnetic field, are the main parameters that determine spinnability and the properties of the obtained nanofibres. The influence of selected component pre-processing parameters, of combinations of organic and inorganic components, and of their concentrations on spinning solution viscosity and conductivity, as well as on fibre diameters, is discussed.
DICOMweb™: Background and Application of the Web Standard for Medical Imaging.
Genereaux, Brad W; Dennison, Donald K; Ho, Kinson; Horn, Robert; Silver, Elliot Lewis; O'Donnell, Kevin; Kahn, Charles E
2018-05-10
This paper describes why and how DICOM, the standard that has been the basis for medical imaging interoperability around the world for several decades, has been extended into a full web technology-based standard, DICOMweb. At the turn of the century, healthcare embraced information technology, which created new problems and new opportunities for the medical imaging industry; at the same time, web technologies matured and began serving other domains well. This paper describes DICOMweb, how it extended the DICOM standard, and how DICOMweb can be applied to problems facing healthcare applications to address workflow and the changing healthcare climate.
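As a concrete taste of the web-based services, the sketch below performs a QIDO-RS study search, one of the DICOMweb services; the archive base URL is hypothetical, while the path, query parameters and media type follow DICOM PS3.18.

```python
import requests

# Hypothetical DICOMweb (QIDO-RS) base URL of a PACS archive.
BASE = "https://pacs.example.org/dicom-web"

resp = requests.get(
    f"{BASE}/studies",
    params={"PatientName": "DOE^JOHN", "limit": 10},
    headers={"Accept": "application/dicom+json"},
    timeout=15,
)
resp.raise_for_status()
for study in resp.json():
    # DICOM JSON keys are group/element tags; (0020,000D) is Study Instance UID.
    print(study["0020000D"]["Value"][0])
```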
47 CFR 73.8000 - Incorporation by reference.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Engineering and Technology (OET) Web site: http://www.fcc.gov/oet/info/documents/bulletins/. (1) OET Bulletin...., Suite 1200, Washington, DC 20006, or at the ATSC Web site: http://www.atsc.org/standards.html. (1) ATSC... Standards Institute (ANSI), 25 West 43rd Street, 4th Floor, New York, NY 10036 or at the ANSI Web site: http...
40 CFR 63.3321 - What operating limits must I meet?
Code of Federal Regulations, 2010 CFR
2010-07-01
... Standards for Hazardous Air Pollutants: Paper and Other Web Coating Emission Standards and Compliance Dates § 63.3321 What operating limits must I meet? (a) For any web coating line or group of web coating lines for which you use add-on control devices, unless you use a solvent recovery system and conduct a...
40 CFR 63.3370 - How do I demonstrate compliance with the emission standards?
Code of Federal Regulations, 2010 CFR
2010-07-01
... SOURCE CATEGORIES National Emission Standards for Hazardous Air Pollutants: Paper and Other Web Coating... material, i, in a month, kg. Mvret = Mass of volatile matter retained in the coated web after curing or...-purchased coating material, i, in a month, kg. Mvret = Mass of volatile matter retained in the coated web...
Exploring NASA GES DISC Data with Interoperable Services
NASA Technical Reports Server (NTRS)
Zhao, Peisheng; Yang, Wenli; Hegde, Mahabal; Wei, Jennifer C.; Kempler, Steven; Pham, Long; Teng, William; Savtchenko, Andrey
2015-01-01
Overview of NASA GES DISC (NASA Goddard Earth Science Data and Information Services Center) data with interoperable services. Open-standard, interoperable services improve data discoverability, accessibility and usability through metadata, catalogue and portal standards, and achieve data, information and knowledge sharing across applications with standardized interfaces and protocols. The relevant Open Geospatial Consortium (OGC) data services and specifications include the Web Coverage Service (WCS) for data, the Web Map Service (WMS) for pictures of data, the Web Map Tile Service (WMTS) for pictures of data tiles, and Styled Layer Descriptors (SLD) for rendered styles.
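A minimal sketch of a WCS 2.0 GetCoverage request of the sort such services accept is shown below; the endpoint and coverage identifier are hypothetical placeholders, while the repeated SUBSET keyword-value pairs follow the WCS 2.0 KVP convention.

```python
import requests

# Hypothetical WCS 2.0 endpoint and coverage identifier.
WCS_URL = "https://example.gsfc.nasa.gov/wcs"

params = [                                    # list of tuples: SUBSET repeats
    ("SERVICE", "WCS"),
    ("VERSION", "2.0.1"),
    ("REQUEST", "GetCoverage"),
    ("COVERAGEID", "SurfAirTemp_Monthly"),    # hypothetical coverage name
    ("SUBSET", "Lat(30,50)"),                 # spatial subsetting per WCS 2.0 KVP
    ("SUBSET", "Lon(-110,-70)"),
    ("FORMAT", "image/tiff"),
]

resp = requests.get(WCS_URL, params=params, timeout=60)
resp.raise_for_status()
with open("coverage.tif", "wb") as f:
    f.write(resp.content)
```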
The value of the Semantic Web in the laboratory.
Frey, Jeremy G
2009-06-01
The Semantic Web is beginning to impact on the wider chemical and physical sciences, beyond the earlier adopted bio-informatics. While useful in large-scale data driven science with automated processing, these technologies can also help integrate the work of smaller scale laboratories producing diverse data. The semantics aid the discovery, reliable re-use of data, provide improved provenance and facilitate automated processing by increased resilience to changes in presentation and reduced ambiguity. The Semantic Web, its tools and collections are not yet competitive with well-established solutions to current problems. It is in the reduced cost of instituting solutions to new problems that the versatility of Semantic Web-enabled data and resources will make their mark once the more general-purpose tools are more available.
The PubChem chemical structure sketcher
2009-01-01
PubChem is an important public, Web-based information source for chemical and bioactivity information. In order to provide convenient structure search methods on compounds stored in this database, one mandatory component is a Web-based drawing tool for interactive sketching of chemical query structures. Web-enabled chemical structure sketchers are not new, having been in existence for years; however, the available solutions rely on complex technology like Java applets or platform-dependent plug-ins. Due to general policy and support incident rate considerations, Java-based or platform-specific sketchers cannot be deployed as part of public NCBI Web services. Our solution is a chemical structure sketching tool based exclusively on CGI server processing, client-side JavaScript functions, and image sequence streaming. The PubChem structure editor does not require the presence of any specific runtime support libraries or browser configurations on the client. It is completely platform-independent and verified to work on all major Web browsers, including older ones without support for Web 2.0 JavaScript objects. PMID:20298522
Effects of electric field on the maximum electro-spinning rate of silk fibroin solutions.
Park, Bo Kyung; Um, In Chul
2017-02-01
Owing to the excellent cyto-compatibility of silk fibroin (SF) and the simple fabrication of nano-fibrous webs, electro-spun SF webs have attracted much research attention in numerous biomedical fields. Because the production rate of electro-spun webs depends strongly on the electro-spinning rate used, that rate is of increasing importance. In the present study, to improve the electro-spinning rate of SF solutions, various electric fields were applied during electro-spinning of SF, and their effects on the maximum electro-spinning rate of the SF solution, as well as on the diameters and molecular conformations of the electro-spun SF fibers, were examined. As the electric field was increased, the maximum electro-spinning rate of the SF solution also increased. The maximum electro-spinning rate of a 13% SF solution could be increased 12× by increasing the electric field from 0.5 kV/cm (0.25 mL/h) to 2.5 kV/cm (3.0 mL/h). The dependence of the fiber diameter on the applied electric field was not significant when using less-concentrated SF solutions (7-9% SF). On the other hand, at higher SF concentrations the electric field had a greater effect on the resulting fiber diameter. The electric field had a minimal effect on the molecular conformation and crystallinity index of the electro-spun SF webs. Copyright © 2016 Elsevier B.V. All rights reserved.
Developing a GIS for CO2 analysis using lightweight, open source components
NASA Astrophysics Data System (ADS)
Verma, R.; Goodale, C. E.; Hart, A. F.; Kulawik, S. S.; Law, E.; Osterman, G. B.; Braverman, A.; Nguyen, H. M.; Mattmann, C. A.; Crichton, D. J.; Eldering, A.; Castano, R.; Gunson, M. R.
2012-12-01
There are advantages to approaching the realm of geographic information systems (GIS) using lightweight, open source components in place of a more traditional web map service (WMS) solution. Rapid prototyping, schema-less data storage, the flexible interchange of components, and open source community support are just some of the benefits. In our effort to develop an application supporting the geospatial and temporal rendering of remote sensing carbon-dioxide (CO2) data for the CO2 Virtual Science Data Environment project, we have connected heterogeneous open source components together to form a GIS. Utilizing widely popular open source components including the schema-less database MongoDB, Leaflet interactive maps, the HighCharts JavaScript graphing library, and Python Bottle web-services, we have constructed a system for rapidly visualizing CO2 data with reduced up-front development costs. These components can be aggregated together, resulting in a configurable stack capable of replicating features provided by more standard GIS technologies. The approach we have taken is not meant to replace the more established GIS solutions, but to instead offer a rapid way to provide GIS features early in the development of an application and to offer a path towards utilizing more capable GIS technology in the future.
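To illustrate the shape of such a stack, the sketch below exposes a Bottle endpoint that queries CO2 soundings from MongoDB by bounding box and returns GeoJSON ready for a Leaflet layer; the database, collection and field names are hypothetical, not those of the actual project.

```python
from bottle import Bottle, request
from pymongo import MongoClient

app = Bottle()
# Hypothetical database/collection of CO2 soundings with lon/lat/xco2/time fields.
col = MongoClient("mongodb://localhost:27017")["co2"]["soundings"]

@app.route("/co2")
def co2_in_bbox():
    """Return CO2 soundings inside ?w=&s=&e=&n= as GeoJSON for a Leaflet layer."""
    w, s = float(request.query.w), float(request.query.s)
    e, n = float(request.query.e), float(request.query.n)
    docs = col.find({"lon": {"$gte": w, "$lte": e},
                     "lat": {"$gte": s, "$lte": n}})
    features = [{
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [d["lon"], d["lat"]]},
        "properties": {"xco2_ppm": d["xco2"], "time": str(d["time"])},
    } for d in docs]
    return {"type": "FeatureCollection", "features": features}  # Bottle emits JSON

if __name__ == "__main__":
    app.run(host="localhost", port=8080)
```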
Nutrient-mediated architectural plasticity of a predatory trap.
Blamires, Sean J; Tso, I-Min
2013-01-01
Nutrients such as protein may be actively sought by foraging animals. Many predators exhibit foraging plasticity, but how their foraging strategies are affected when faced with nutrient deprivation is largely unknown. In spiders, the assimilation of protein into silk may be in conflict with somatic processes so we predicted web building to be affected under protein depletion. To assess the influence of protein intake on foraging plasticity we fed the orb-web spiders Argiope aemula and Cyclosa mulmeinensis high, low or no protein solutions over 10 days and allowed them to build webs. We compared post-feeding web architectural components and major ampullate (MA) silk amino acid compositions. We found that the number of radii in webs increased in both species when fed high protein solutions. Mesh size increased in A. aemula when fed a high protein solution. MA silk proline and alanine compositions varied in each species with contrasting variations in alanine between the two species. Glycine compositions only varied in C. mulmeinensis silk. No spiders significantly lost or gained mass on any feeding treatment, so they did not sacrifice somatic maintenance for amino acid investment in silk. Our results show that the amount of protein taken in significantly affects the foraging decisions of trap-building predators, such as orb web spiders. Nevertheless, the subtle differences found between species in the association between protein intake, the amino acids invested in silk and web architectural plasticity show that the influence of protein deprivation on specific foraging strategies differs among different spiders.
A Web-based approach to blood donor preparation.
France, Christopher R; France, Janis L; Kowalsky, Jennifer M; Copley, Diane M; Lewis, Kristin N; Ellis, Gary D; McGlone, Sarah T; Sinclair, Kadian S
2013-02-01
Written and video approaches to donor education have been shown to enhance donation attitudes and intentions to give blood, particularly when the information provides specific coping suggestions for donation-related concerns. This study extends this work by comparing Web-based approaches to donor preparation among donors and nondonors. Young adults (62% female; mean [±SD] age, 19.3 [±1.5] years; mean [range] number of prior blood donations, 1.1 [0-26]; 60% nondonors) were randomly assigned to view 1) a study Web site designed to address common blood donor concerns and suggest specific coping strategies (n = 238), 2) a standard blood center Web site (n = 233), or 3) a control Web site where participants viewed videos of their choice (n = 202). Measures of donation attitude, anxiety, confidence, intention, anticipated regret, and moral norm were completed before and after the intervention. Among nondonors, the study Web site produced greater changes in donation attitude, confidence, intention, and anticipated regret relative to both the standard and the control Web sites, but only differed significantly from the control Web site for moral norm and anxiety. Among donors, the study Web site produced greater changes in donation confidence and anticipated regret relative to both the standard and the control Web sites, but only differed significantly from the control Web site for donation attitude, anxiety, intention, and moral norm. Web-based donor preparation materials may provide a cost-effective way to enhance donation intentions and encourage donation behavior. © 2012 American Association of Blood Banks.
A Brief Introduction to Web-Based Note Capture
ERIC Educational Resources Information Center
Ovadia, Steven
2012-01-01
While physical notebooks and locally saved electronic files are certainly helpful, there are a number of web-based solutions that might be useful to someone conducting research online, or looking to hold their notes in a web-based environment. The main advantage of a web-based note capture tool is that one is able to access it from just about…
Key-phrase based classification of public health web pages.
Dolamic, Ljiljana; Boyer, Célia
2013-01-01
This paper describes and evaluates a public health web page classification model based on key-phrase extraction and matching. Easily extensible both to new classes and to new languages, this method proves to be a good solution for text classification in the face of a total lack of training data. To evaluate the proposed solution we used a small collection of public health related web pages created by double-blind manual classification. Our experiments have shown that, by choosing an adequate threshold value, the desired precision or recall can be achieved.
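A minimal sketch of threshold-based key-phrase matching is shown below; the class lexicons and threshold are invented for illustration and are far smaller than anything a real classifier would use.

```python
# Minimal key-phrase matching classifier (illustrative lexicons and threshold).
LEXICONS = {
    "nutrition": {"diet", "vitamin", "calorie", "nutrient"},
    "cardiology": {"heart", "blood pressure", "cholesterol", "arrhythmia"},
}

def classify(text, threshold=2):
    text = text.lower()
    # Count how many of each class's key phrases occur in the page text.
    scores = {c: sum(p in text for p in phrases) for c, phrases in LEXICONS.items()}
    best = max(scores, key=scores.get)
    # Raising the threshold trades recall for precision, as discussed above.
    return best if scores[best] >= threshold else "unclassified"

print(classify("Daily vitamin intake and calorie counting for a balanced diet"))
print(classify("A short note on web design"))
```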
An Interoperable Architecture for Air Pollution Early Warning System Based on Sensor Web
NASA Astrophysics Data System (ADS)
Samadzadegan, F.; Zahmatkesh, H.; Saber, M.; Ghazi khanlou, H. J.
2013-09-01
Environmental monitoring systems deal with time-sensitive issues which require quick responses in emergency situations. Handling sensor observations in near real time and extracting valuable information from them are challenging issues in these systems, from both a technical and a scientific point of view. Ever-increasing population growth in urban areas has caused problems in developing countries that have a direct or indirect impact on human life. One applicable solution for controlling and managing air quality is to use real-time, up-to-date air quality information gathered by spatially distributed sensors in megacities, employing Sensor Web technology to develop monitoring and early warning systems. Urban air quality monitoring systems use the functionality of geospatial information systems as a platform for analysing, processing and visualizing data, in combination with the Sensor Web, to support decision support systems in disaster management and emergency situations. This system uses the Sensor Web Enablement (SWE) framework of the Open Geospatial Consortium (OGC), which offers a standard framework that allows the integration of sensors and sensor data into spatial data infrastructures. The SWE framework introduces standards for services to access sensor data and discover events from sensor data streams, as well as a set of standards for the description of sensors and the encoding of measurements. The presented system provides capabilities to collect, transfer, share and process air quality sensor data and to disseminate air quality status in real time, and interoperability challenges can be overcome by using this standard framework. In a routine scenario, air quality data measured by in-situ sensors are communicated to a central station where the data are analysed and processed. The extracted air quality status is screened for emergency situations and, if necessary, air quality reports are sent to the authorities. This research proposes an architecture showing how to integrate air quality sensor data streams into a geospatial data infrastructure, presenting an interoperable air quality monitoring system that supports disaster management with real-time information. The developed system was tested on Tehran air pollution sensors, calculating the Air Quality Index (AQI) for the CO pollutant and subsequently notifying registered users by warning e-mails in emergency cases. An air quality monitoring portal is used to retrieve and visualize sensor observations through the interoperable framework. The system also provides capabilities to retrieve SOS observations using WPS in a cascaded service-chaining pattern for monitoring trends in sensor observations.
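As an illustration of the AQI step, the sketch below applies the standard piecewise-linear AQI interpolation to a CO reading; the breakpoints are the commonly cited US EPA 8-hour CO values, which may differ from those used by the Tehran system.

```python
# Piecewise-linear AQI for CO (8-h average, ppm). Breakpoints are the commonly
# cited US EPA values; the Tehran system may use different national breakpoints.
CO_BREAKPOINTS = [
    (0.0, 4.4, 0, 50), (4.5, 9.4, 51, 100), (9.5, 12.4, 101, 150),
    (12.5, 15.4, 151, 200), (15.5, 30.4, 201, 300), (30.5, 50.4, 301, 500),
]

def co_aqi(ppm):
    for c_lo, c_hi, i_lo, i_hi in CO_BREAKPOINTS:
        if c_lo <= ppm <= c_hi:
            # Linear interpolation within the matching breakpoint segment.
            return round((i_hi - i_lo) / (c_hi - c_lo) * (ppm - c_lo) + i_lo)
    raise ValueError("concentration outside AQI scale")

reading = 10.2                 # ppm from an in-situ sensor
aqi = co_aqi(reading)          # -> 113
print(aqi)
if aqi > 100:
    print("AQI above 100: send warning e-mail to registered users")
```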
77 FR 42197 - Small Business Size Standards: Construction
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-18
... ``exception'') under NAICS 237990, Other Heavy and Civil Engineering Construction, from $20 million to $30... available on its Web site at www.sba.gov/size for public review and comments. The ``Size Standards... developing, reviewing, and modifying size standards when necessary. SBA published the document on its Web...
da Silva, Kátia Regina; Costa, Roberto; Crevelari, Elizabeth Sartori; Lacerda, Marianna Sobral; de Moraes Albertini, Caio Marcos; Filho, Martino Martinelli; Santana, José Eduardo; Vissoci, João Ricardo Nickenig; Pietrobon, Ricardo; Barros, Jacson V
2013-01-01
The ability to apply standard and interoperable solutions for implementing and managing medical registries, as well as to aggregate, reproduce and access data sets across legacy and advanced standard formats, platforms and operating systems, is crucial for both clinical healthcare and biomedical research settings. Our study describes a reproducible, highly scalable, standard framework for a device registry implementation addressing both local data quality components and global linking problems. We developed a device registry framework involving the following steps: (1) data standards definition and representation of the research workflow; (2) development of electronic case report forms using REDCap (Research Electronic Data Capture); (3) data collection according to the clinical research workflow; (4) data augmentation by enriching the registry database with local electronic health records, governmental databases and linked open data collections; (5) data quality control; and (6) data dissemination through the registry web site. Our registry adopted all applicable standardized data elements proposed by the American College of Cardiology / American Heart Association Clinical Data Standards, as well as variables derived from cardiac device randomized trials and the Clinical Data Interchange Standards Consortium. Local interoperability was achieved between REDCap and data derived from the Electronic Health Record system. The original data set was also augmented by incorporating the reimbursement values paid by the Brazilian government during a hospitalization for pacemaker implantation. By linking our registry to the open data collection repository Linked Clinical Trials (LinkedCT), we found 130 clinical trials that are potentially correlated with our pacemaker registry. This study demonstrates how standard and reproducible solutions can be applied in the implementation of medical registries to constitute a re-usable framework. Such an approach has the potential to facilitate data integration between healthcare and research settings, and to serve as a useful framework for other biomedical registries.
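As a rough sketch of the registry's data-access layer, the snippet below exports records through the standard REDCap API; the endpoint URL and API token are placeholders, while the POST parameter names follow the REDCap API convention.

```python
import requests

# Placeholder REDCap endpoint and API token.
REDCAP_URL = "https://redcap.example.org/api/"
API_TOKEN = "REPLACE_ME"

payload = {
    "token": API_TOKEN,
    "content": "record",   # export registry records
    "format": "json",
    "type": "flat",        # one row per record
}
resp = requests.post(REDCAP_URL, data=payload, timeout=30)
resp.raise_for_status()
records = resp.json()
print(f"exported {len(records)} registry records")
```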
Enterprise-scale image distribution with a Web PACS.
Gropper, A; Doyle, S; Dreyer, K
1998-08-01
The integration of images with existing and new health care information systems poses a number of challenges in a multi-facility network: image distribution to clinicians; making DICOM image headers consistent across information systems; and integration of teleradiology into PACS. A novel, Web-based enterprise PACS architecture introduced at Massachusetts General Hospital provides a solution. Four AMICAS Web/Intranet Image Servers were installed as the default DICOM destination of 10 digital modalities. A fifth AMICAS receives teleradiology studies via the Internet. Each AMICAS includes: a Java-based interface to the IDXrad radiology information system (RIS), a DICOM autorouter to tape-library archives and to the Agfa PACS, a wavelet image compressor/decompressor that preserves compatibility with DICOM workstations, a Web server to distribute images throughout the enterprise, and an extensible interface which permits links between other HIS and AMICAS. Using wavelet compression and Internet standards as its native formats, AMICAS creates a bridge to the DICOM networks of remote imaging centers via the Internet. This teleradiology capability is integrated into the DICOM network and the PACS thereby eliminating the need for special teleradiology workstations. AMICAS has been installed at MGH since March of 1997. During that time, it has been a reliable component of the evolving digital image distribution system. As a result, the recently renovated neurosurgical ICU will be filmless and use only AMICAS workstations for mission-critical patient care.
Going, going, still there: using the WebCite service to permanently archive cited Web pages.
Eysenbach, Gunther
2006-01-01
Scholars are increasingly citing electronic "web references" which are not preserved in libraries or full text archives. WebCite is a new standard for citing web references. To "webcite" a document involves archiving the cited Web page through www.webcitation.org and citing the WebCite permalink instead of (or in addition to) the unstable live Web page.
An Introduction to Web Accessibility, Web Standards, and Web Standards Makers
ERIC Educational Resources Information Center
McHale, Nina
2011-01-01
Librarians and libraries have long been committed to providing equitable access to information. In the past decade and a half, the growth of the Internet and the rapid increase in the number of online library resources and tools have added a new dimension to this core duty of the profession: ensuring accessibility of online resources to users with…
Leveraging the Cloud for Integrated Network Experimentation
2014-03-01
kernel settings, or any of the low-level subcomponents. 3. Scalable Solutions: Businesses can build scalable solutions for their clients, ranging from...values. These values can assume several distributions, including normal, Pareto, uniform, exponential and Poisson, among others [21]. Additionally, D...communication, the web client establishes a connection to the server before traffic begins to flow. Web servers do not initiate connections to clients in
SAS- Semantic Annotation Service for Geoscience resources on the web
NASA Astrophysics Data System (ADS)
Elag, M.; Kumar, P.; Marini, L.; Li, R.; Jiang, P.
2015-12-01
There is a growing need for increased integration across the data and model resources that are disseminated on the web, to advance their reuse across different earth science applications. Meaningful reuse of resources requires semantic metadata to realize the semantic web vision of pragmatic linkage and integration among resources. Semantic metadata associates standard metadata with resources to turn them into semantically enabled resources on the web. However, the lack of a common standardized metadata framework, as well as the uncoordinated use of metadata fields across different geo-information systems, has led to a situation in which standards and related standard names abound. To address this need, we have designed SAS to provide a bridge between the core ontologies required to annotate resources and information systems, in order to enable queries and analysis over annotations from a single (web) environment. SAS is one of the services provided by the Geosemantic framework, a decentralized semantic framework that supports the integration of models and data and allows semantically heterogeneous resources to interact with minimal human intervention. Here we present the design of SAS and demonstrate its application in annotating data and models. First we describe how predicates and their attributes are extracted from standards and ingested into the knowledge base of the Geosemantic framework. Then we illustrate the application of SAS in annotating data managed by SEAD and in annotating simulation models that have a web interface. SAS is a step in a broader approach to raise the quality of geoscience data and models published on the web and to allow users to better search, access and use existing resources, based on standard vocabularies that are encoded and published using semantic technologies.
Moby and Moby 2: creatures of the deep (web).
Vandervalk, Ben P; McCarthy, E Luke; Wilkinson, Mark D
2009-03-01
Facile and meaningful integration of data from disparate resources is the 'holy grail' of bioinformatics. Some resources have begun to address this problem by providing their data using Semantic Web standards, specifically the Resource Description Framework (RDF) and the Web Ontology Language (OWL). Unfortunately, adoption of Semantic Web standards has been slow overall, and even in cases where the standards are being utilized, interconnectivity between resources is rare. In response, we have seen the emergence of centralized 'semantic warehouses' that collect public data from third parties, integrate it, translate it into OWL/RDF and provide it to the community as a unified and queryable resource. One limitation of the warehouse approach is that queries are confined to the resources that have been selected for inclusion. A related problem, perhaps of greater concern, is that the majority of bioinformatics data exists in the 'Deep Web'-that is, the data does not exist until an application or analytical tool is invoked, and therefore does not have a predictable Web address. The inability to utilize Uniform Resource Identifiers (URIs) to address this data is a barrier to its accessibility via URI-centric Semantic Web technologies. Here we examine 'The State of the Union' for the adoption of Semantic Web standards in the health care and life sciences domain by key bioinformatics resources, explore the nature and connectivity of several community-driven semantic warehousing projects, and report on our own progress with the CardioSHARE/Moby-2 project, which aims to make the resources of the Deep Web transparently accessible through SPARQL queries.
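A hedged sketch of querying such a semantic warehouse through its SPARQL endpoint is shown below; the endpoint URL and vocabulary are invented for illustration, and only the SPARQLWrapper usage pattern is standard.

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Hypothetical SPARQL endpoint of a life-science semantic warehouse.
sparql = SPARQLWrapper("https://example.org/sparql")
sparql.setQuery("""
    PREFIX ex: <http://example.org/bio#>
    SELECT ?gene ?label WHERE {
        ?gene a ex:Gene ;
              ex:associatedWith ex:Cardiomyopathy ;
              ex:label ?label .
    } LIMIT 10
""")
sparql.setReturnFormat(JSON)

# Each binding row maps variable names to RDF terms.
for row in sparql.query().convert()["results"]["bindings"]:
    print(row["gene"]["value"], row["label"]["value"])
```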
Interoperability in planetary research for geospatial data analysis
NASA Astrophysics Data System (ADS)
Hare, Trent M.; Rossi, Angelo P.; Frigeri, Alessandro; Marmo, Chiara
2018-01-01
For more than a decade there has been a push in the planetary science community to support interoperable methods for accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (e.g., craters, volcanoes) or any data that can be tied to the surface of a planetary body (including moons, comets or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized geospatial image formats, geologic mapping conventions, U.S. Federal Geographic Data Committee (FGDC) cartographic and metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter includes defined standards such as the OGC Web Map Services (simple image maps), Web Map Tile Services (cached image tiles), Web Feature Services (feature streaming), Web Coverage Services (rich scientific data streaming), and Catalog Services for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they can be just as valuable for the planetary domain. Another initiative, called VESPA (Virtual European Solar and Planetary Access), will marry several of the above geoscience standards with astronomy-based standards as defined by the International Virtual Observatory Alliance (IVOA). This work outlines the current state of interoperability initiatives in use or under investigation within the planetary geospatial community.
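For orientation, a WMS GetMap request of the kind these services answer can be issued with plain HTTP; in this hedged Python sketch the endpoint and layer name are illustrative, not an actual planetary server.

```python
# Building a WMS 1.3.0 GetMap request; endpoint and layer are placeholders.
import requests

params = {
    "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
    "LAYERS": "mars_mola_shade",            # hypothetical layer name
    "CRS": "EPSG:4326",
    "BBOX": "-90,-180,90,180",              # lat/lon axis order in WMS 1.3.0
    "WIDTH": 1024, "HEIGHT": 512,
    "FORMAT": "image/png",
}
r = requests.get("http://example.org/wms", params=params, timeout=30)
with open("mars.png", "wb") as f:
    f.write(r.content)                      # the rendered map image
```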
Medclic: the Mediterranean in one click
NASA Astrophysics Data System (ADS)
Troupin, Charles; Frontera, Biel; Sebastián, Kristian; Pau Beltran, Joan; Krietemeyer, Andreas; Gómara, Sonia; Gomila, Mikel; Escudier, Romain; Juza, Mélanie; Mourre, Baptiste; Garau, Angels; Cañellas, Tomeu; Tintoré, Joaquín
2016-04-01
"Medclic: the Mediterranean in one click" is a research and dissemination project focused on the scientific, technological and societal approaches of the Balearic Islands Coastal Observing and Forecasting System ({SOCIB}{www.socib.es}) in a collaboration with "la Caixa" Foundation. SOCIB aims at research excellence and the development of technology which enables progress toward the sustainable management of coastal and marine environments, providing solutions to meet the needs of society. Medclic goes one step forward and has two main goals: at the scientific level, to advance in establishing and understanding the mesoscale variability at the regional scale and its interaction, and thus improving the characterisation of the "oceanic weather" in the Mediterranean; at the outreach level: to bring SOCIB and the new paradigm of multi-platform observation in real time closer to society, through scientific outreach. SOCIB Data Centre is the core of the new multi-platform and real time oceanography and is responsible for directing the different stages of data management, ranging from data acquisition to its distribution and visualization through web applications. The system implemented relies on open source solutions and provides data in line with international standards and conventions (INSPIRE, netCDF Climate and Forecast, ldots). In addition, the Data Centre has implemented a REST web service, called Data Discovery. This service allows data generated by SOCIB to be integrated into applications developed by the Data Centre itself or by third parties, as it is the case with Medclic. Relying on this data distribution, the new web Medclic, www.medclic.es, constitutes an interactive scientific and educational area of communication that contributes to the rapprochement of the general public with the new marine and coastal observing technologies. Thanks to the Medclic web, data coming from new observing technologies in oceanography are available in real time and in one clic for all the society. Exploring different observing systems, knowing the temperature and swell forecasts, and discovering the importance of oceanographic research will be possible in a playful and interactive way.
MExLab Planetary Geoportal: 3D-access to planetary images and results of spatial data analysis
NASA Astrophysics Data System (ADS)
Karachevtseva, I.; Garov, A.
2015-10-01
The MExLab Planetary Geoportal was developed as a Geodesy and Cartography Node providing access to results of studies of celestial bodies, such as DEMs and orthoimages, as well as basemaps, crater catalogues and derivative products: slope, roughness, crater density (http://cartsrv.mexlab.ru/geoportal). The main feature of the Geoportal is the ability to run spatial queries and access content selected from the list of available datasets (Phobos, Mercury, the Moon, including Lunokhod archive data). The prior version of the Geoportal was developed using Flash technology. We are now developing a new version that will use a 3D API (OpenGL, WebGL) based on shaders, not only for standard 3D functionality but for 2D mapping as well. Users can obtain quantitative and qualitative characteristics of objects in graphical, tabular and 3D forms. This will bring the advantages of unified code and processing speed, and provide a number of functional advantages based on GIS tools, such as: - the possibility of dynamic raster transformation into the needed map projection; - effective co-registration of planetary images by combining spatial data geometries; - presentation in 3D form of different types of data, including planetary atmospheric measurements, subsurface radar data, etc. The system will be created with a new software architecture, which has potential for development and flexibility in reconfiguration, based on a cross-platform solution: - an application for three types of platforms: desktop (Windows, Linux, OSX), web (any HTML5 browser) and mobile (Android, iOS); - a single codebase shared between platforms (using cross-compilation for the Web); - a new telecommunication solution to connect modules and external systems such as the PROVIDE WebGIS (http://www.provide-space.eu/progis/). The research leading to these results was partly supported by the European Community's Seventh Framework Programme (FP7/2007-2013) under grant agreement n° 312377 PRoViDE.
NASA Astrophysics Data System (ADS)
Čepický, Jáchym; Moreira de Sousa, Luís
2016-06-01
The OGC® Web Processing Service (WPS) Interface Standard provides rules for standardizing inputs and outputs (requests and responses) for geospatial processing services, such as polygon overlay. The standard also defines how a client can request the execution of a process, and how the output from the process is handled. It defines an interface that facilitates the publishing of geospatial processes and client discovery of, and binding to, those processes within workflows. Data required by a WPS can be delivered across a network or can be available at the server. PyWPS was one of the first server-side implementations of OGC WPS. It is written in the Python programming language and aims to connect to all existing tools for geospatial data analysis available on the Python platform. During the last two years, the PyWPS development team has written a new version (called PyWPS-4) completely from scratch. The analysis of large raster datasets poses several technical issues in implementing the WPS standard. The data format has to be defined and validated on the server side, and binary data have to be encoded using some numeric representation. Pulling raster data from remote servers introduces security risks; in addition, running several processes in parallel has to be possible, so that system resources are used efficiently while preserving security. Here we discuss these topics and illustrate some of the solutions adopted within the PyWPS implementation.
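As a sketch of what a server-side process looks like in this style, the following is modelled on the pattern the PyWPS-4 documentation uses for declaring a process; the identifiers and names here are illustrative, not part of the paper.

```python
# A minimal PyWPS-4-style process declaration (illustrative identifiers).
from pywps import Process, LiteralInput, LiteralOutput

class SayHello(Process):
    def __init__(self):
        inputs = [LiteralInput('name', 'Name of person', data_type='string')]
        outputs = [LiteralOutput('response', 'Greeting', data_type='string')]
        super().__init__(
            self._handler,
            identifier='say_hello',   # advertised in GetCapabilities
            title='Say hello',
            version='1.0',
            inputs=inputs,
            outputs=outputs)

    def _handler(self, request, response):
        # executed when a client sends an Execute request
        name = request.inputs['name'][0].data
        response.outputs['response'].data = 'Hello ' + name
        return response
```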
Desveaux, Laura; Shaw, James; Saragosa, Marianne; Soobiah, Charlene; Marani, Husayn; Hensel, Jennifer; Agarwal, Payal; Onabajo, Nike; Bhatia, R Sacha; Jeffs, Lianne
2018-03-16
The increasing use of Web-based solutions for health prevention and promotion presents opportunities to improve self-management and adherence to guideline-based therapy for individuals with type 2 diabetes (T2DM). Despite promising preliminary evidence, many users stop using Web-based solutions due to the burden of data entry, hidden costs, loss of interest, and a lack of comprehensive features. Evaluations tend to focus on effectiveness or impact and fail to evaluate the nuanced variables that may interact to contribute to outcome success (or failure). This study aimed to evaluate a Web-based solution for improving self-management in T2DM to identify key combinations of contextual variables and mechanisms of action that explain for whom the solution worked best and in what circumstances. A qualitative realist evaluation was conducted with one-on-one, semistructured telephone interviews completed at baseline and again toward the end of the intervention period (3 months). Topics included participants' experiences of using the Web-based solution, barriers and facilitators of self-management, and barriers and facilitators to effective use. Transcripts were analyzed using thematic analysis strategies, after which the key themes were used to develop statements of the relationships between the key contextual factors, mechanisms of action, and impact on the primary outcome (glycated hemoglobin, HbA1c). Twenty-six interviews (14 baseline, 12 follow-up) were completed with 16 participants with T2DM, and the following 3 key groups emerged: the easiest fit, the best fit, and those who failed to activate. Self-efficacy and willingness to engage with the solution facilitated improvement in HbA1c, whereas competing priorities and psychosocial issues created barriers to engagement. Individuals with high baseline self-efficacy who were motivated, took ownership of their actions, and prioritized diabetes management were early and eager adopters of the app and recorded improvements in HbA1c over the intervention period. Individuals with moderate baseline self-efficacy and no competing priorities, who identified gaps in their understanding of how their actions influence their health, were slow to adopt use but recorded the greatest improvements in HbA1c. The final group had low baseline self-efficacy and identified a range of psychosocial issues and competing priorities. These participants were uncertain of the benefits of using a Web-based solution to support self-management, ultimately resulting in minimal engagement and no improvement in HbA1c. Self-efficacy, competing priorities, previous behavior change, and beliefs about Web-based solutions interact to determine engagement and impact on clinical outcomes. Considering the balance of these patient characteristics is likely to help health care providers identify individuals who are apt to benefit from a Web-based solution to support self-management of T2DM. Web-based solutions could be modified to incorporate existing screening measures to identify individuals who are at risk of suboptimal adherence, informing the provision of additional support(s) as needed.
Roles and Responsibilities for Web Council Members
Members represent their Region or AAship on the Web Council, act as a primary point of contact, coordinate Regional/AAship web development within broader Agency efforts, including One EPA Web standards and best practices, and have other responsibilities.
pWeb: A High-Performance, Parallel-Computing Framework for Web-Browser-Based Medical Simulation.
Halic, Tansel; Ahn, Woojin; De, Suvranu
2014-01-01
This work presents pWeb, a new language and compiler for the parallelization of client-side compute-intensive web applications such as surgical simulations. The recently introduced HTML5 standard has enabled unprecedented applications on the web. Web browser performance, however, remains the bottleneck for computationally intensive applications, including visualization of complex scenes, real-time physical simulation and image processing, compared with native applications. The proposed language is built upon web workers for multithreaded programming in HTML5. The language provides the fundamental functionality of parallel programming languages as well as the fork/join parallel model, which is not supported by web workers. The language compiler automatically generates an equivalent parallel script that complies with the HTML5 standard. A case study on realistic rendering for surgical simulations demonstrates enhanced performance with a compact set of instructions.
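pWeb itself targets HTML5 web workers, but the fork/join model it supplies can be sketched language-neutrally; the following Python analogy (using concurrent.futures, not pWeb) shows the pattern of forking work to workers and joining the results.

```python
# Fork/join sketch: split work across workers, then merge the results.
from concurrent.futures import ProcessPoolExecutor

def render_chunk(rows):
    # stand-in for a compute-intensive kernel (e.g. per-pixel shading)
    return [sum(r) for r in rows]

def fork_join(scene, workers=4):
    chunks = [scene[i::workers] for i in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:  # fork
        results = list(pool.map(render_chunk, chunks))      # join
    return [v for chunk in results for v in chunk]

if __name__ == "__main__":
    print(fork_join([[1, 2], [3, 4], [5, 6], [7, 8]]))
```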
JavaScript Access to DICOM Network and Objects in Web Browser.
Drnasin, Ivan; Grgić, Mislav; Gogić, Goran
2017-10-01
The Digital Imaging and Communications in Medicine (DICOM) 3.0 standard provides the baseline for picture archiving and communication systems (PACS). The development of the Internet and various communication media initiated demand for non-DICOM access to PACS systems. The ever-increasing use of web browsers, laptops and handheld devices, as opposed to desktop applications on fixed organizational computers, led to the development of different web technologies, which DICOM standards officials subsequently accepted as alternative means of access. This paper provides an overview of the current state of development of web access technology to DICOM repositories. It presents a different approach that uses HTML5 features of web browsers through the JavaScript language and the WebSocket protocol, enabling real-time communication with DICOM repositories. A JavaScript DICOM network library, a DICOM-to-WebSocket proxy and a proof-of-concept web application that qualifies as a DICOM 3.0 device were developed.
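The paper's library is JavaScript, but the DICOM-network side can be sketched from Python with the pynetdicom package; in this hedged example the host, port and AE title are placeholders, and a C-ECHO is used as the simplest round-trip.

```python
# A hedged sketch of talking to a DICOM node; host/port are placeholders.
from pynetdicom import AE

VERIFICATION_SOP_CLASS = "1.2.840.10008.1.1"

ae = AE(ae_title="TESTSCU")
ae.add_requested_context(VERIFICATION_SOP_CLASS)

assoc = ae.associate("127.0.0.1", 11112)  # hypothetical PACS node
if assoc.is_established:
    status = assoc.send_c_echo()          # DICOM "ping"
    print("C-ECHO status:", status)
    assoc.release()
```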
An open source, web based, simple solution for seismic data dissemination and collaborative research
NASA Astrophysics Data System (ADS)
Diviacco, Paolo
2005-06-01
Collaborative research and data dissemination in the field of geophysical exploration need network tools that can access large amounts of data from anywhere, using any PC or workstation. Simple solutions based on a combination of Open Source software can be developed to address such requests, exploiting the possibilities offered by web technologies while avoiding the costs and inflexibility of commercial systems. A viable solution consists of MySQL for data storage and retrieval, CWP/SU and GMT for data visualisation, and a scripting layer driven by PHP that allows users to access the system via an Apache web server. In the light of the experience of building the on-line archive of seismic data of the Istituto Nazionale di Oceanografia e di Geofisica Sperimentale (OGS), we describe the solutions and methods adopted, with a view to stimulating both networked collaborative research at institutions similar to ours and the development of different applications.
Pragmatic Computing - A Semiotic Perspective to Web Services
NASA Astrophysics Data System (ADS)
Liu, Kecheng
The web seems to have evolved from a syntactic web, through a semantic web, to a pragmatic web. This evolution conforms to the study of information and technology from the theory of semiotics. Pragmatics, concerned with the use of information in relation to context and intended purposes, is extremely important in web services and applications. Much research in pragmatics has been carried out, but at the same time, attempts and solutions have led to further questions. After reviewing current work on the pragmatic web, the paper presents a semiotic approach to web services, particularly request decomposition and service aggregation.
Practical solutions to implementing "Born Semantic" data systems
NASA Astrophysics Data System (ADS)
Leadbetter, A.; Buck, J. J. H.; Stacey, P.
2015-12-01
The concept of data being "Born Semantic" has been proposed in recent years as a Semantic Web analogue to the idea of data being "born digital" [1], [2]. Within the "Born Semantic" concept, data are captured digitally and, at a point close to the time of creation, are annotated with markup terms from semantic web resources (controlled vocabularies, thesauri or ontologies). This allows heterogeneous data to be more easily ingested and amalgamated in near real-time, owing to the standards-compliant annotation of the data. In taking the "Born Semantic" proposal from concept to operation, a number of difficulties have been encountered. For example, although there are recognised methods such as Header, Dictionary, Triples [3] for the compression, publication and dissemination of large volumes of triples, these systems are not practical to deploy in the field on low-powered (both electrically and computationally) devices. Similarly, it is not practical for instruments to output fully formed, semantically annotated data files if they are designed to be plugged into a modular system with the data logged centrally in the field, as is the case on Argo floats and oceanographic gliders, where internal bandwidth becomes an issue [2]. In light of these issues, this presentation will concentrate on pragmatic solutions being developed to the problem of generating Linked Data in near real-time systems. Specific examples will be highlighted from the European Commission SenseOCEAN project, where Linked Data systems are being developed for autonomous underwater platforms, and from work being undertaken on streaming data from the Irish Galway Bay Cable Observatory initiative. Further, a set of tools for the LogStash-ElasticSearch software ecosystem allowing the storage and retrieval of Linked Data will be introduced. References: [1] A. Leadbetter & J. Fredericks, We have "born digital" - now what about "born semantic"?, European Geosciences Union General Assembly, 2014. [2] J. Buck & A. Leadbetter, Born semantic: linking data from sensors to users and balancing hardware limitations with data standards, European Geosciences Union General Assembly, 2015. [3] J. Fernandez et al., Binary RDF Representation for Publication and Exchange (HDT), Web Semantics 19:22-41, 2013.
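A minimal sketch of the field-side, "born semantic" annotation argued for above: the instrument emits a compact JSON-LD-style record whose keys resolve to controlled-vocabulary URIs. The vocabulary term below is a hypothetical placeholder, not an actual vocabulary entry.

```python
# A lightweight, bandwidth-friendly annotation emitted at the sensor side;
# the vocabulary URI is illustrative only.
import json

reading = {
    "@context": {
        "temp": "http://vocab.example.org/P01/TEMPPR01"  # hypothetical term
    },
    "temp": 12.7,                        # degrees Celsius
    "time": "2015-06-01T12:00:00Z",
}
print(json.dumps(reading, indent=2))     # small enough for low-bandwidth links
```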
NASA Astrophysics Data System (ADS)
Alvarez, Alejandro; Beche, Alexandre; Furano, Fabrizio; Hellmich, Martin; Keeble, Oliver; Rocha, Ricardo
2012-12-01
The Disk Pool Manager (DPM) is a lightweight solution for grid-enabled disk storage management. Operated at more than 240 sites, it has the widest distribution of all grid storage solutions in the WLCG infrastructure. It provides an easy way to manage and configure disk pools, and exposes multiple interfaces for data access (rfio, xroot, nfs, gridftp and http/dav) and control (srm). During the last year we have been working on providing stable, high-performance data access to our storage system using standard protocols, while extending the storage management functionality and adapting both configuration and deployment procedures to reuse commonly used building blocks. In this contribution we cover in detail the extensive evaluation we have performed of our new HTTP/WebDAV and NFS 4.1 frontends, in terms of functionality and performance. We summarize the issues we faced and the solutions we developed to turn them into valid alternatives to the existing grid protocols - namely the additional work required to provide multi-stream transfers for high-performance wide area access, support for third-party copies and credential delegation, and the required changes in the experiment and fabric management frameworks and tools. We describe new functionality that has been added to ease system administration, such as different filesystem weights and a faster disk drain, and new configuration and monitoring solutions based on the industry standards Puppet and Nagios. Finally, we explain some of the internal changes we made to the DPM architecture to better handle the additional load from the analysis use cases.
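For orientation, the HTTP/WebDAV access path such frontends expose can be exercised with standard HTTP verbs; in this hedged Python sketch the URLs and credentials are placeholders, and a WebDAV COPY with a Destination header stands in for a third-party copy.

```python
# Hedged sketch of WebDAV-style access; URLs and certificates are placeholders.
import requests

src = "https://dpm.example.org/dpm/data/file.root"
dst = "https://remote.example.org/store/file.root"
cert = ("usercert.pem", "userkey.pem")       # hypothetical grid credentials

r = requests.get(src, cert=cert)             # plain HTTP download
print(r.status_code, len(r.content))

r = requests.request("COPY", src,            # server-driven third-party copy
                     headers={"Destination": dst},
                     cert=cert)
print(r.status_code)
```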
HTML5: a new standard for the Web.
Hoy, Matthew B
2011-01-01
HTML5 is the newest revision of the HTML standard developed by the World Wide Web Consortium (W3C). This new standard adds several exciting new features and capabilities to HTML. This article will briefly discuss the history of HTML standards, explore what changes the new HTML5 standard brings, and consider its implications for information professionals. A list of HTML5 resources and examples is also provided.
Panoramic-image-based rendering solutions for visualizing remote locations via the web
NASA Astrophysics Data System (ADS)
Obeysekare, Upul R.; Egts, David; Bethmann, John
2000-05-01
With advances in panoramic image-based rendering techniques and the rapid expansion of web advertising, new techniques are emerging for visualizing remote locations on the WWW. The success of these techniques depends on how easy and inexpensive it is to develop a new type of web content that provides pseudo-3D visualization at home, 24 hours a day. Furthermore, the acceptance of this new visualization medium depends on how effective the familiarization tools are for a segment of the population never before exposed to this type of visualization. This paper addresses various hardware and software solutions available to collect, produce, and view panoramic content. While the cost and effectiveness of building content are addressed using a few commercial hardware solutions, the effectiveness of familiarization tools is evaluated using a few sample data sets.
Kwak, Dae Hyun; Lee, Eun Ju; Kim, Deug Joong
2014-11-01
Hydroxyapatite/cellulose acetate composite webs were fabricated by an electro-spinning process, which makes it possible to fabricate complex three-dimensional shapes. A nanofibrous web consisting of cellulose acetate and hydroxyapatite was produced from their mixture solution by electro-spinning under high voltage. The surface of the electro-spun fiber was modified by a plasma and an alkaline solution in order to increase its bioactivity. The structure, morphology and properties of the electro-spun fibers were investigated, and in-vitro bioactivity was evaluated in simulated body fluid (SBF). Bioactivity of the electro-spun web was enhanced by increasing the filler concentration and by surface treatment. The surface changes of electro-spun fibers modified by plasma and alkaline solution were investigated by FT-IR (Fourier Transform Infrared Spectroscopy) and XPS (X-ray Photoelectron Spectroscopy).
Incentives to Encourage Scientific Web Contribution (Invited)
NASA Astrophysics Data System (ADS)
Antunes, A. K.
2010-12-01
We suggest improvements to citation standards and the creation of remuneration opportunities to encourage career scientists to contribute to Web 2.0 and social media science channels. At present, agencies want to accomplish better outreach and engagement with no funding, while scientists sacrifice their personal time to contribute to web and social media sites. Securing active participation by scientists requires career recognition of the value scientists provide to web knowledge bases and to the general public. One primary mechanism to encourage participation is citation standards, which let contributors improve their reputation in a quantifiable way. But such standards must be recognized by their scientific and workplace communities. Using case studies such as the acceptance of the web in the workplace and the growth of open access journals, we examine what agencies and individuals can do, as well as the time scales needed to secure increased active contribution by scientists. We also discuss ways to jumpstart this process.
A cross disciplinary study of link decay and the effectiveness of mitigation techniques.
Hennessey, Jason; Ge, Steven
2013-01-01
The dynamic, decentralized World Wide Web has become an essential part of scientific research and communication. Researchers create thousands of web sites every year to share software, data and services. These valuable resources tend to disappear over time. The problem has been documented in many subject areas. Our goal is to conduct a cross-disciplinary investigation of the problem and test the effectiveness of existing remedies. We accessed 14,489 unique web pages found in the abstracts within Thomson Reuters' Web of Science citation index that were published between 1996 and 2010, and found that the median lifespan of these web pages was 9.3 years, with 62% of them archived. Survival analysis and logistic regression were used to find significant predictors of URL lifespan. The availability of a web page depends most on the time it was published and on the top-level domain name. Similar statistical analysis revealed biases in current solutions: the Internet Archive favors web pages with fewer layers in the Uniform Resource Locator (URL), while WebCite is significantly influenced by the source of publication. We also created a prototype for a process to submit web pages to the archives, and increased coverage of our list of scientific web pages in the Internet Archive and WebCite by 22% and 255%, respectively. Our results show that link decay continues to be a problem across different disciplines and that current solutions for static web pages are helping and can be improved.
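A sketch of the two mitigation steps the study examines, checking whether a URL still resolves and submitting it for archiving, might look like this in Python; the target URL is a placeholder, while the Internet Archive's public "Save Page Now" endpoint shown is real.

```python
# Check link liveness, then ask the Internet Archive to snapshot the page.
import requests

def is_alive(url, timeout=10):
    try:
        r = requests.head(url, timeout=timeout, allow_redirects=True)
        return r.status_code < 400
    except requests.RequestException:
        return False

url = "http://example.org/supplementary-data"   # placeholder
if is_alive(url):
    # Save Page Now: prefixing a URL with /save/ requests a snapshot
    requests.get("https://web.archive.org/save/" + url, timeout=60)
else:
    print("dead link:", url)
```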
Nutrient-Mediated Architectural Plasticity of a Predatory Trap
Blamires, Sean J.; Tso, I-Min
2013-01-01
Background Nutrients such as protein may be actively sought by foraging animals. Many predators exhibit foraging plasticity, but how their foraging strategies are affected when faced with nutrient deprivation is largely unknown. In spiders, the assimilation of protein into silk may be in conflict with somatic processes so we predicted web building to be affected under protein depletion. Methodology/Principal Findings To assess the influence of protein intake on foraging plasticity we fed the orb-web spiders Argiope aemula and Cyclosa mulmeinensis high, low or no protein solutions over 10 days and allowed them to build webs. We compared post-feeding web architectural components and major ampullate (MA) silk amino acid compositions. We found that the number of radii in webs increased in both species when fed high protein solutions. Mesh size increased in A. aemula when fed a high protein solution. MA silk proline and alanine compositions varied in each species with contrasting variations in alanine between the two species. Glycine compositions only varied in C. mulmeinensis silk. No spiders significantly lost or gained mass on any feeding treatment, so they did not sacrifice somatic maintenance for amino acid investment in silk. Conclusions/Significance Our results show that the amount of protein taken in significantly affects the foraging decisions of trap-building predators, such as orb web spiders. Nevertheless, the subtle differences found between species in the association between protein intake, the amino acids invested in silk and web architectural plasticity show that the influence of protein deprivation on specific foraging strategies differs among different spiders. PMID:23349928
Real-time POD-CFD Wind-Load Calculator for PV Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huayamave, Victor; Divo, Eduardo; Ceballos, Andres
The primary objective of this project is to create an accurate web-based real-time wind-load calculator. This is of paramount importance for (1) rapid and accurate assessment of the uplift and downforce loads on a PV mounting system and (2) identifying viable solutions from available mounting systems, thereby helping reduce the cost of mounting hardware and installation. Wind loading calculations for structures are currently performed according to the American Society of Civil Engineers/Structural Engineering Institute Standard ASCE/SEI 7; the values in this standard were calculated from simplified models that do not necessarily take into account relevant characteristics such as those from full 3D effects, end effects, turbulence generation and dissipation, as well as minor effects derived from shear forces on installation brackets and other accessories. This standard does not include provisions that address the special requirements of rooftop PV systems, and attempts to apply it may lead to significant design errors as wind loads are incorrectly estimated. An accurate calculator is therefore of paramount importance for the preliminary assessment of uplift and downforce loads on a PV mounting system and for identifying viable solutions from available mounting systems, helping reduce the cost of the mounting system and installation. The challenge is that although a full-fledged three-dimensional computational fluid dynamics (CFD) analysis would properly and accurately capture the complete physical effects of air flow over PV systems, it would be impractical for this tool, which is intended to be a real-time web-based calculator. CFD routinely requires enormous computation times to arrive at solutions that can be deemed accurate and grid-independent, even on powerful and massively parallel computer platforms. This work is expected not only to accelerate solar deployment nationwide, but also to help reach the SunShot Initiative goals of reducing the total installed cost of solar energy systems by 75%. The largest percentage of the total installed cost of a solar energy system is associated with balance-of-system cost, with up to 40% going to "soft" costs, which include customer acquisition, financing, contracting, permitting, interconnection, inspection, installation, performance, operations, and maintenance. The calculator being developed will provide wind loads in real time for any solar system design and suggest the proper installation configuration and hardware; it is therefore anticipated to reduce system design, installation and permitting costs.
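The real-time speed such a calculator needs typically comes from a reduced-order model built from precomputed CFD snapshots; as a hedged illustration (not the project's actual code), proper orthogonal decomposition (POD) can be sketched with an SVD in numpy.

```python
# POD sketch: extract dominant modes from CFD snapshots, then reconstruct
# a new field from a few coefficients; data here is synthetic.
import numpy as np

n_points, n_snapshots, k = 500, 40, 5
snapshots = np.random.rand(n_points, n_snapshots)  # stand-in CFD fields

mean = snapshots.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(snapshots - mean, full_matrices=False)
modes = U[:, :k]                                   # dominant spatial modes

# project a field onto the reduced basis and reconstruct in real time
field = snapshots[:, [0]]
coeffs = modes.T @ (field - mean)
approx = mean + modes @ coeffs
print("reconstruction error:", np.linalg.norm(approx - field))
```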
Interfaces to PeptideAtlas: a case study of standard data access systems
Handcock, Jeremy; Robinson, Thomas; Deutsch, Eric W.; Boyle, John
2012-01-01
Access to public data sets is important to the scientific community as a resource to develop new experiments or validate new data. Projects such as the PeptideAtlas, Ensembl and The Cancer Genome Atlas (TCGA) offer both access to public data and a repository to share their own data. Access to these data sets is often provided through a web page form and a web service API. Access technologies based on web protocols (e.g. http) have been in use for over a decade and are widely adopted across the industry for a variety of functions (e.g. search, commercial transactions, and social media). Each architecture adapts these technologies to provide users with tools to access and share data. Both commonly used web service technologies (e.g. REST and SOAP), and custom-built solutions over HTTP are utilized in providing access to research data. Providing multiple access points ensures that the community can access the data in the simplest and most effective manner for their particular needs. This article examines three common access mechanisms for web accessible data: BioMart, caBIG, and Google Data Sources. These are illustrated by implementing each over the PeptideAtlas repository and reviewed for their suitability based on specific usages common to research. BioMart, Google Data Sources, and caBIG are each suitable for certain uses. The tradeoffs made in the development of the technology are dependent on the uses each was designed for (e.g. security versus speed). This means that an understanding of specific requirements and tradeoffs is necessary before selecting the access technology. PMID:22941959
A System for Information Management in BioMedical Studies—SIMBioMS
Krestyaninova, Maria; Zarins, Andris; Viksna, Juris; Kurbatova, Natalja; Rucevskis, Peteris; Neogi, Sudeshna Guha; Gostev, Mike; Perheentupa, Teemu; Knuuttila, Juha; Barrett, Amy; Lappalainen, Ilkka; Rung, Johan; Podnieks, Karlis; Sarkans, Ugis; McCarthy, Mark I; Brazma, Alvis
2009-01-01
Summary: SIMBioMS is a web-based open source software system for managing data and information in biomedical studies. It provides a solution for the collection, storage, management and retrieval of information about research subjects and biomedical samples, as well as experimental data obtained using a range of high-throughput technologies, including gene expression, genotyping, proteomics and metabonomics. The system can easily be customized and has proven to be successful in several large-scale multi-site collaborative projects. It is compatible with emerging functional genomics data standards and provides data import and export in accepted standard formats. Protocols for transferring data to durable archives at the European Bioinformatics Institute have been implemented. Availability: The source code, documentation and initialization scripts are available at http://simbioms.org. Contact: support@simbioms.org; mariak@ebi.ac.uk PMID:19633095
Schuler, Thilo; Boeker, Martin; Klar, Rüdiger; Müller, Marcel
2007-01-01
The requirements of highly specialized clinical domains are often underrepresented in hospital information systems (HIS). Common consequences are that documentation remains paper-based or that external systems with insufficient HIS integration are used. This paper presents a solution to overcome this deficiency in the form of a generic framework based on the HL7 Clinical Document Architecture. The central architectural idea is the definition of customized forms using a schema-controlled XML language. These flexible form definitions drive the user interface, the data storage, and standardized data exchange. A successful proof-of-concept application in a dermatologic outpatient wound care department has been implemented and is well accepted by the clinicians. Our work with HL7 CDA revealed the need for further practical research in the health information standards realm.
NASA Technical Reports Server (NTRS)
Burks, Jason E.; Molthan, Andrew L.; McGrath, Kevin M.
2014-01-01
During the last year several significant disasters have occurred, such as Superstorm Sandy on the East Coast of the United States and Typhoon Bopha in the Philippines, along with several others. In response to these disasters, NASA's Short-term Prediction Research and Transition (SPoRT) Center delivered various products derived from satellite imagery to help in the assessment of damage and recovery of the affected areas. To better support the decision makers responding to the disasters, SPoRT quickly developed several solutions to provide the data in open Geographic Information System (GIS) formats. Providing the data in open GIS standard formats allowed end users to easily integrate the data into existing Decision Support Systems (DSS). Both Tile Mapping Service (TMS) and Web Mapping Service (WMS) were leveraged to quickly provide the data to the end user. Development of the delivery methodology allowed quick response to rapidly developing disasters and enabled NASA SPoRT to bring science data to decision makers in a successful research-to-operations transition.
Web-based radiology: a future to be created.
Canadè, Adolfo; Palladino, Francesco; Pitzalis, Gianluca; Campioni, Paolo; Marano, Pasquale
2003-01-01
The impact of the Internet on medicine and surgery is certainly remarkable; its influence on diagnostic imaging has been even stronger. The standardization of digital images acquired by different medical imaging equipment has further facilitated diffusion, transmission and communication in radiology, within hospitals as well as on the Web. Radiology departments are bound to become "filmless", and with today's tablet PCs radiological images will be transferred directly to the patient's bedside within the corresponding electronic patient record. For radiology, interactive education can be envisaged, with a tutor who guides students through the network. The Internet is an inexhaustible source of radiologic educational and informational material, with numerous sites offering clinical cases, tutorials and teaching files, journals and magisterial lectures on-line. In the near future, the Internet could be applied to the simulation of clinico-radiologic cases or in applications of artificial intelligence, with expert systems to support the solution of the most complex cases.
New Perspectives: Wave Mechanical Interpretations of Dark Matter, Baryon and Dark Energy
NASA Astrophysics Data System (ADS)
Russell, Esra
We model the cosmic components - dark matter, dark energy and baryon distributions in the Cosmic Web - by means of highly nonlinear Schrödinger-type and reaction-diffusion-type wave mechanical descriptions. The construction of these wave mechanical models of structure formation is achieved by introducing the Fisher information measure and comparing it with a highly nonlinear term that is dynamically analogous to the well-known quantum potential in the wave equations. Strikingly, the comparison of this nonlinear term with the Fisher information measure provides a dynamical distinction between self-organization and its absence in the evolution of the cosmic components. Mathematically equivalent to the standard cosmic fluid equations, these approaches make it possible to follow the evolution of the matter distribution even into the highly nonlinear regime by circumventing singularities. Numerical realizations of the emerging web-like patterns arising from the nonlinear dynamics of the baryon component are also presented, while the dark energy component shows Gaussian-type dynamics corresponding to soliton-like solutions.
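For orientation, one standard form of such a wave-mechanical description (a sketch; the paper's exact nonlinear terms may differ) is the Schrödinger-Poisson system, with the quantum potential emerging from the Madelung substitution.

```latex
% One common wave-mechanical model of structure formation (illustrative):
i\hbar\,\frac{\partial \psi}{\partial t}
  = -\frac{\hbar^{2}}{2m}\,\nabla^{2}\psi + m\Phi\,\psi ,
\qquad
\nabla^{2}\Phi = 4\pi G\,|\psi|^{2} .

% Writing \psi = \sqrt{\rho}\, e^{iS/\hbar} (Madelung form) exposes the
% quantum potential referred to in the abstract:
Q = -\frac{\hbar^{2}}{2m}\,\frac{\nabla^{2}\sqrt{\rho}}{\sqrt{\rho}} .
```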
Web Implementation of Quality Assurance (QA) for X-ray Units in Balkanic Medical Institutions.
Urošević, Vlade; Ristić, Olga; Milošević, Danijela; Košutić, Duško
2015-08-01
Diagnostic radiology is the major contributor to the total dose of the population from all artificial sources. In order to reduce radiation exposure and optimize diagnostic x-ray image quality, it is necessary to increase the quality and efficiency of quality assurance (QA) and audit programs. This work presents a web application providing completely new QA solutions for x-ray modalities and facilities. The software gives complete online information (using European standards) with which the corresponding institutions and individuals can evaluate and control a facility's Radiation Safety and QA program. The software enables storage of all data in one place and sharing the same information (data), regardless of whether the measured data is used by an individual user or by an authorized institution. The software overcomes the distance and time separation of institutions and individuals who take part in QA. Upgrading the software will enable assessment of the medical exposure level to ionizing radiation.
Available, intuitive and free! Building e-learning modules using web 2.0 services.
Tam, Chun Wah Michael; Eastwood, Anne
2012-01-01
E-learning is part of the mainstream in medical education and often provides the most efficient and effective means of engaging learners in a particular topic. However, translating design and content ideas into a useable product can be technically challenging, especially in the absence of information technology (IT) support. There is little published literature on the use of web 2.0 services to build e-learning activities. To build the GP Synergy evidence-based medicine and critical appraisal online course, we used and integrated a number of free web 2.0 services, including: Prezi, a web-based presentation platform; YouTube, a video sharing service; Google Docs, an online document platform; Tiny.cc, a URL shortening service; and Wordpress, a blogging platform. The course, consisting of five multimedia-rich, tutorial-like modules, was built without IT specialist assistance or specialised software. The web 2.0 services used were free, and the course can be accessed with a modern web browser. Modern web 2.0 services remove many of the technical barriers to creating and sharing content on the internet. When used synergistically, these services can be a flexible and low-cost platform for building e-learning activities. They were a pragmatic solution in our context.
Visual Design Principles Applied To World Wide Web Construction.
ERIC Educational Resources Information Center
Luck, Donald D.; Hunter, J. Mark
This paper describes basic types of World Wide Web pages and presents design criteria for page layout based on principles of visual literacy. Discussion focuses on pages that present information in the following styles: billboard; directory/index; textual; and graphics. Problems and solutions in Web page construction are explored according to…
Introducing the Virtual Astronomy Multimedia Project
NASA Astrophysics Data System (ADS)
Wyatt, Ryan; Christensen, L. L.; Gauthier, A.; Hurt, R.
2008-05-01
The goal of the Virtual Astronomy Multimedia Project (VAMP) is to promote and vastly multiply the use of astronomy multimedia resources—from images and illustrations to animations, movies, and podcasts—and enable innovative future exploitation of a wide variety of outreach media by systematically linking resource archives worldwide. High-quality astronomical images, accompanied by rich caption and background information, abound on the web and yet prove notoriously difficult to locate efficiently using existing search tools. The Virtual Astronomy Multimedia Project offers a solution via the Astronomy Visualization Metadata (AVM) standard. Due to roll out in time for IYA2009, VAMP manages the design, implementation, and dissemination of the AVM standard for the education and public outreach astronomical imagery that observatories publish. VAMP will support implementations in World Wide Telescope, Google Sky, Portal to the Universe, and 365 Days of Astronomy, as well as Uniview and DigitalSky software designed specifically for planetariums. The VAMP workshop will introduce the AVM standard and describe its features, highlighting sample image tagging processes using diverse tools—the critical first step in getting media into VAMP. Participants with laptops will have an opportunity to experiment first hand, and workshop organizers will update a web page with system requirements and software options in advance of the conference (see http://virtualastronomy.org/ASP2008/ for links to resources). The workshop will also engage participants in a discussion and review of the innovative AVM image hierarchy taxonomy, which will soon be extended to other types of media.
2016-07-21
Today's internet has multiple webs. The surface web is what Google and other search engines index and pull based on links. Essentially, the surface...financial records, research and development), and personal data (medical records or legal documents). These are all deep web. Standard search engines don't
Life Cycle Project Plan Outline: Web Sites and Web-based Applications
This tool is a guideline for planning and checking for 508 compliance on web sites and web based applications. Determine which EIT components are covered or excepted, which 508 standards and requirements apply, and how to implement them.
Taking a traditional web site to patient portal technology.
Labow, Kimberly
2010-01-01
In this era of consumer-driven healthcare, consumers (your current and potential patients) seek healthcare information on the Internet. If your practice doesn't have a Web site, or has one that's static and uninformative, you won't be found, and the patient will move on to the next practice Web site. Why? Because only the most graphically appealing, informative, and patient-centered Web sites will drive patients to your practice. Patients are demanding improved communication with their physician. A practice Web site is a start, but the adoption of a fully functional, interactive Web site with patient portal solutions will not only improve patient-to-provider relationships but will also give the patient access to your practice from anywhere, at any time of the day. Furthermore, these solutions can help practices increase efficiencies and revenue, while reducing operating costs. With the American Recovery and Reinvestment Act of 2009 and other incentives for healthcare information technology adoption, the time is right for your practice to consider implementing technology that will bring considerable value to your practice and also increase patient satisfaction.
NASA Astrophysics Data System (ADS)
Buck, Justin; Leadbetter, Adam
2015-04-01
New users of the growing volume of ocean data, for purposes such as 'big data' products and operational data assimilation/ingestion, require data to be readily ingestible. This can be achieved via the application of World Wide Web Consortium (W3C) Linked Data and Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) standards to data management. As part of several European Horizon 2020 projects (SenseOCEAN, ODIP, AtlantOS), the British Oceanographic Data Centre (BODC) is working on combining existing data centre architecture and SWE software, such as Sensor Observation Services, with a Linked Data front end. The standards to enable data delivery are proven and well documented [1,2]. There are practical difficulties when SWE standards are applied to real-time data, because of internal hardware bandwidth restrictions and a requirement to constrain data transmission costs. A pragmatic approach is proposed in which sensor metadata and data output in OGC standards are implemented "shore-side", with sensors and instruments transmitting unique resolvable web linkages to persistent OGC SensorML records published at the BODC. References: [1] World Wide Web Consortium. (2013). Linked Data. Available: http://www.w3.org/standards/semanticweb/data. Last accessed 8th October 2014. [2] Open Geospatial Consortium. (2014). Sensor Web Enablement (SWE). Available: http://www.opengeospatial.org/ogc/markets-technologies/swe. Last accessed 8th October 2014.
Outsourced central archiving: an information bridge in a multi-IMAC environment
NASA Astrophysics Data System (ADS)
Gustavsson, Staffan; Tylen, Ulf; Carlsson, Goeran; Angelhed, Jan-Erik; Wintell, Mikael; Helmersson, Roger; Norrby, Clas
2001-08-01
In 1998 three hospitals merged to form the Sahlgrenska University Hospital, bringing total radiology production to 325,000 examinations per year. Two different PACS and RIS, with different and incompatible archiving solutions, had been in use since 1996. One PACS was of commercial origin; the other had been developed in-house. Together they managed one third of the total production. Due to differences in standards compliance and system architecture, communication between them was unsatisfactory. In order to improve efficiency, communication and the level of service to our customers, the situation was evaluated. It was decided to build a transparent, virtual radiology department based on a modular approach. A common RIS and a central DICOM image archive were chosen as the central nodes in a star-configured system. Web technology was chosen as the solution for the distribution of images and reports. The reasons for these decisions, as well as the present status of the installation, are described and discussed in this paper.
A Routing Mechanism for Cloud Outsourcing of Medical Imaging Repositories.
Godinho, Tiago Marques; Viana-Ferreira, Carlos; Bastião Silva, Luís A; Costa, Carlos
2016-01-01
Web-based technologies have been increasingly used in picture archiving and communication systems (PACS), in services related to storage, distribution, and visualization of medical images. Nowadays, many healthcare institutions are outsourcing their repositories to the cloud. However, managing communications between multiple geo-distributed locations is still challenging, due to the complexity of dealing with huge volumes of data under bandwidth constraints. Moreover, standard methodologies still do not take full advantage of outsourced archives, namely because their integration with other in-house solutions is troublesome. In order to improve the performance of distributed medical imaging networks, a smart routing mechanism was developed. This includes an innovative cache system based on the splitting and dynamic management of Digital Imaging and Communications in Medicine (DICOM) objects. The proposed solution was successfully deployed in a regional PACS archive. The results obtained show that it outperforms conventional approaches, reducing both remote-access latency and the required cache storage space.
NASA Astrophysics Data System (ADS)
Arenas, Marcelo; Gutierrez, Claudio; Pérez, Jorge
The Resource Description Framework (RDF) is the standard data model for representing information about World Wide Web resources. In January 2008, the W3C released its recommendation for querying RDF data, a query language called SPARQL. In this chapter, we give a detailed description of the semantics of this language. We start by focusing on the definition of a formal semantics for the core part of SPARQL, and then move to the definition for the entire language, including all the features in the W3C specification of SPARQL, such as blank nodes in graph patterns and bag semantics for solutions.
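A small runnable illustration of the graph-pattern semantics the chapter formalizes, using the Python rdflib package: OPTIONAL keeps a solution even when the optional pattern does not match, leaving the variable unbound.

```python
# OPTIONAL semantics: Bob has no email, but still appears in the results.
import rdflib

g = rdflib.Graph()
g.parse(data="""
@prefix ex: <http://example.org/> .
ex:alice ex:name "Alice" ; ex:email "alice@example.org" .
ex:bob   ex:name "Bob" .
""", format="turtle")

q = """
PREFIX ex: <http://example.org/>
SELECT ?name ?email WHERE {
  ?p ex:name ?name .
  OPTIONAL { ?p ex:email ?email }
}"""
for name, email in g.query(q):
    print(name, email)   # Bob's ?email stays unbound (None)
```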
EarthServer: Use of Rasdaman as a data store for use in visualisation of complex EO data
NASA Astrophysics Data System (ADS)
Clements, Oliver; Walker, Peter; Grant, Mike
2013-04-01
The European Commission FP7 project EarthServer is establishing open access and ad-hoc analytics on extreme-size Earth Science data, based on and extending cutting-edge Array Database technology. EarthServer is built around the Rasdaman Raster Data Manager which extends standard relational database systems with the ability to store and retrieve multi-dimensional raster data of unlimited size through an SQL style query language. Rasdaman facilitates visualisation of data by providing several Open Geospatial Consortium (OGC) standard interfaces through its web services wrapper, Petascope. These include the well established standards, Web Coverage Service (WCS) and Web Map Service (WMS) as well as the emerging standard, Web Coverage Processing Service (WCPS). The WCPS standard allows the running of ad-hoc queries on the data stored within Rasdaman, creating an infrastructure where users are not restricted by bandwidth when manipulating or querying huge datasets. Here we will show that the use of EarthServer technologies and infrastructure allows access and visualisation of massive scale data through a web client with only marginal bandwidth use as opposed to the current mechanism of copying huge amounts of data to create visualisations locally. For example if a user wanted to generate a plot of global average chlorophyll for a complete decade time series they would only have to download the result instead of Terabytes of data. Firstly we will present a brief overview of the capabilities of Rasdaman and the WCPS query language to introduce the ways in which it is used in a visualisation tool chain. We will show that there are several ways in which WCPS can be utilised to create both standard and novel web based visualisations. An example of a standard visualisation is the production of traditional 2d plots, allowing users the ability to plot data products easily. However, the query language allows the creation of novel/custom products, which can then immediately be plotted with the same system. For more complex multi-spectral data, WCPS allows the user to explore novel combinations of bands in standard band-ratio algorithms through a web browser with dynamic updating of the resultant image. To visualise very large datasets Rasdaman has the capability to dynamically scale a dataset or query result so that it can be appraised quickly for use in later unscaled queries. All of these techniques are accessible through a web based GIS interface increasing the number of potential users of the system. Lastly we will show the advances in dynamic web based 3D visualisations being explored within the EarthServer project. By utilising the emerging declarative 3D web standard X3DOM as a tool to visualise the results of WCPS queries we introduce several possible benefits, including quick appraisal of data for outliers or anomalous data points and visualisation of the uncertainty of data alongside the actual data values.
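To make the "ship the query, not the data" pattern concrete, a WCPS request can be sent over a ProcessCoverages-style binding such as Petascope exposes; in this hedged Python sketch the endpoint and coverage name are illustrative.

```python
# Send a WCPS aggregate query to the server; only the result travels back.
import requests

query = """
for c in (chl_decade_series)
return encode(avg(c), "csv")
"""
r = requests.get("http://example.org/rasdaman/ows",
                 params={"service": "WCS", "version": "2.0.1",
                         "request": "ProcessCoverages", "query": query},
                 timeout=60)
print(r.text)   # the decade-average value, not terabytes of data
```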
Advances on Sensor Web for Internet of Things
NASA Astrophysics Data System (ADS)
Liang, S.; Bermudez, L. E.; Huang, C.; Jazayeri, M.; Khalafbeigi, T.
2013-12-01
'In much the same way that HTML and HTTP enabled the WWW, the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE), envisioned in 2001 [1], will allow sensor webs to become a reality.' Due to the large number of sensor manufacturers and differing accompanying protocols, integrating diverse sensors into observation systems is not a simple task. A coherent infrastructure is needed to treat sensors in an interoperable, platform-independent and uniform way. SWE standardizes web service interfaces, sensor descriptions and data encodings as building blocks for a Sensor Web. SWE standards are now mature specifications (version 2.0) with approved OGC compliance test suites and tens of independent implementations. Many earth and space science organizations and government agencies are using the SWE standards to publish and share their sensors and observations. While SWE has been demonstrated to be very effective for scientific sensors, its complexity and computational overhead may not be suitable for resource-constrained tiny sensors. In June 2012, a new OGC Standards Working Group (SWG) was formed, called the Sensor Web Interface for Internet of Things (SWE-IoT) SWG. This SWG focuses on developing one or more OGC standards for resource-constrained sensors and actuators (e.g., Internet of Things devices) while leveraging the existing OGC SWE standards. In the near future, billions to trillions of small sensors and actuators will be embedded in real-world objects and connected to the Internet, facilitating a concept called the Internet of Things (IoT). By populating our environment with real-world sensor-based devices, the IoT is opening the door to exciting possibilities for a variety of application domains, such as environmental monitoring, transportation and logistics, urban informatics, smart cities, as well as personal and social applications. The current SWE-IoT development aims at modeling the IoT components and defining a standard web service that makes the observations captured by IoT devices easily accessible and allows users to task the actuators on those devices. The SWE-IoT model links things with sensors and reuses the OGC Observations and Measurements (O&M) model to link sensors with features of interest and observed properties. Unlike most SWE standards, SWE-IoT defines a RESTful web interface for users to perform CRUD (i.e., create, read, update, and delete) functions on resources, including Things, Sensors, Actuators, Observations, Tasks, etc. Inspired by the OASIS Open Data Protocol (OData), the SWE-IoT web service provides multi-faceted query, meaning that users can query different entity collections and link from one entity to other related entities. This presentation will introduce the latest development of the OGC SWE-IoT standards. Potential applications and implications in Earth and space science will also be discussed. [1] Mike Botts, Sensor Web Enablement White Paper, Open GIS Consortium, Inc., 2002.
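A hedged sketch of the OData-inspired, multi-faceted REST queries described above (a pattern that later surfaced in the OGC SensorThings API); the service root, entity ids and payload below are placeholders.

```python
# OData-style navigation and CRUD against a SWE-IoT-like service root.
import requests

root = "http://example.org/swe-iot/v1.0"   # hypothetical service root

# read a Thing together with related entities in one request ($expand)
r = requests.get(f"{root}/Things(1)", params={"$expand": "Datastreams"})
thing = r.json()
print(thing.get("name"))

# create an Observation (the C in CRUD) against a datastream
obs = {"result": 21.4, "phenomenonTime": "2013-09-01T12:00:00Z"}
requests.post(f"{root}/Datastreams(1)/Observations", json=obs)
```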
User-driven Cloud Implementation of environmental models and data for all
NASA Astrophysics Data System (ADS)
Gurney, R. J.; Percy, B. J.; Elkhatib, Y.; Blair, G. S.
2014-12-01
Environmental data and models come from disparate sources over a variety of geographical and temporal scales with different resolutions and data standards, often including terabytes of data and model simulations. Unfortunately, these data and models tend to remain solely within the custody of the private and public organisations which create the data, and the scientists who build models and generate results. Although many models and datasets are theoretically available to others, the lack of ease of access tends to keep them out of reach of many. We have developed an intuitive web-based tool that utilises environmental models and datasets located in a cloud to produce results that are appropriate to the user. Storyboards showing the interfaces and visualisations have been created for each of several exemplars. A library of virtual machine images has been prepared to serve these exemplars. Each virtual machine image has been tailored to run computer models appropriate to the end user. Two approaches have been used: the first exposes the models as RESTful web services conforming to the Open Geospatial Consortium (OGC) Web Processing Service (WPS) interface standard, using the Python-based PyWPS; the second uses a MySQL database interrogated through PHP code. In all cases, the web client sends the server an HTTP GET request to execute the process with a number of parameter values and, once execution terminates, an XML or JSON response is sent back and parsed at the client side to extract the results (see the sketch below). All web services are stateless, i.e. application state is not maintained by the server, reducing its operational overheads and simplifying infrastructure management tasks such as load balancing and failure recovery. A hybrid cloud solution has been used, with models and data sited on both private and public clouds. The storyboards have been transformed into intuitive web interfaces at the client side using HTML, CSS and JavaScript, utilising plug-ins such as jQuery and Flot (for graphics), and Google Maps APIs. We have demonstrated that a cloud infrastructure can be used to assemble a virtual research environment that, coupled with a user-driven development approach, is able to cater to the needs of a wide range of user groups, from domain experts to concerned members of the general public.
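A minimal sketch of the stateless request pattern just described, assuming a hypothetical PyWPS endpoint, process identifier and inputs; the WPS KVP parameter names are those of the standard, everything else is illustrative.

```python
import xml.etree.ElementTree as ET
import requests

WPS_URL = "http://example.org/cgi-bin/pywps"  # hypothetical endpoint

params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "run_hydrology_model",  # assumed process id
    "datainputs": "catchment=thames;start=2013-01-01;end=2013-12-31",
}

response = requests.get(WPS_URL, params=params, timeout=300)
response.raise_for_status()

# The server replies with XML (native WPS) or JSON; parse at the client.
if "json" in response.headers.get("Content-Type", ""):
    results = response.json()
else:
    results = ET.fromstring(response.text)  # WPS ExecuteResponse document
```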
Legaz-García, María del Carmen; Martínez-Costa, Catalina; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás
2012-01-01
Linking Electronic Healthcare Record (EHR) content to educational materials has been considered a key international recommendation to enable clinical engagement and to promote patient safety. It would help citizens to access reliable information available on the web and guide them to it properly. In this paper, we describe an approach in that direction, based on the use of dual-model EHR standards and standardized educational contents. The recommendation method will be based on the semantic coverage of the learning content repository for a particular archetype, which will be calculated by applying semantic web technologies such as ontologies and semantic annotations.
Giansanti, Daniele; Morelli, Sandra; Maccioni, Giovanni; Guerriero, Lorenzo; Bedini, Remo; Pepe, Gennaro; Colombo, Cesare; Borghi, Gabriella; Macellari, Velio
2009-01-01
Due to major advances in information technology, telemedicine applications are ready for widespread use. Nonetheless, to allow their diffusion in National Health Care Systems (NHCSs), specific methodologies of health technology assessment (HTA) should be used to assess standardization, overall quality, interoperability, and legal, economic and cost-benefit aspects. One of the limits to the diffusion of digital tele-echocardiography (T-E) applications in NHCSs is the lack of a specific methodology for HTA. In the present study, a solution offering a structured HTA of T-E products was designed. The methodology also assured the definition of standardized quality levels for the application. The first level represents the minimum level of acceptance; the other levels are accessory levels useful for a more accurate assessment of the product. The methodology proved useful in rationalizing the process of standardization and received a high degree of acceptance from the subjects involved in the study.
Implementation and Evaluation of Four Interoperable Open Standards for the Internet of Things.
Jazayeri, Mohammad Ali; Liang, Steve H L; Huang, Chih-Yuan
2015-09-22
Recently, researchers have been focusing on a new use of the Internet called the Internet of Things (IoT), in which enabled electronic devices can be remotely accessed over the Internet. As the realization of the IoT concept is still in its early stages, manufacturers of Internet-connected devices and IoT web service providers are defining their own proprietary protocols based on their targeted applications. Consequently, the IoT has become heterogeneous in terms of hardware capabilities and communication protocols. Addressing these heterogeneities by following open standards is a necessary step towards communicating with various IoT devices. In this research, we assess the feasibility of applying existing open standards on resource-constrained IoT devices. The standard protocols developed in this research are OGC PUCK over Bluetooth, TinySOS, SOS over CoAP, and the OGC SensorThings API. We believe that by hosting open standard protocols on IoT devices, not only do the devices become self-describable, self-contained, and interoperable, but innovative applications can also be easily developed with standardized interfaces. In addition, we use memory consumption, request message size, response message size, and response latency to benchmark the efficiency of the implemented protocols (a benchmarking sketch follows below). In all, this research presents and evaluates standards-based solutions to better understand the feasibility of applying existing standards to the IoT vision.
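As a rough illustration of the latency and message-size metrics used here, the sketch below times repeated HTTP requests against hypothetical device endpoints; the Bluetooth-based PUCK protocol is omitted because it is not reachable over HTTP, and this is not the authors' benchmarking harness.

```python
import time
import requests

# Hypothetical endpoints for the HTTP-reachable implementations under test.
ENDPOINTS = {
    "TinySOS": "http://device-a.local/sos?request=GetCapabilities",
    "SOS-over-CoAP": "http://proxy.local/coap-gateway/sos",  # via HTTP proxy
    "SensorThings API": "http://device-b.local/v1.0/Observations",
}

def benchmark(url, runs=20):
    """Measure mean response latency and response message size."""
    latencies, sizes = [], []
    for _ in range(runs):
        t0 = time.perf_counter()
        r = requests.get(url, timeout=10)
        latencies.append(time.perf_counter() - t0)
        sizes.append(len(r.content))
    return {
        "mean_latency_ms": 1000 * sum(latencies) / runs,
        "mean_response_bytes": sum(sizes) / runs,
    }

for name, url in ENDPOINTS.items():
    print(name, benchmark(url))
```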
ERIC Educational Resources Information Center
Noel-Levitz, Inc, 2009
2009-01-01
Have you updated your Web site today? Is it possible that answering "yes" to this simple question is the key to the success of your marketing and recruiting efforts? In the current recruitment arena, the ability to update and maintain this one high-value asset (your Web site) might be the key to the potency of your institutional…
NASA Astrophysics Data System (ADS)
Santhana Vannan, S.; Cook, R. B.; Wilson, B. E.; Wei, Y.
2010-12-01
Terrestrial ecology data sets are produced from diverse data sources such as model output, field data collection, laboratory analysis and remote sensing observation. These data sets can be created, distributed, and consumed in diverse ways as well. However, this diversity can hinder the usability of the data and limit data users' abilities to validate and reuse data for science and application purposes. Geospatial web services, such as those described in this paper, are an important means of reducing this burden. Terrestrial ecology researchers generally create data sets in diverse file formats, with file and data structures tailored to the specific needs of their project, possibly as tabular data, geospatial images, or documentation in a report. Data centers may reformat the data to an archive-stable format and distribute the data sets through one or more protocols, such as FTP, email, and WWW. Because of the diverse data preparation, delivery, and usage patterns, users have to invest time and resources to bring the data into the format and structure most useful for their analysis. This time-consuming data preparation process shifts valuable resources from data analysis to data assembly. To address these issues, the ORNL DAAC, a NASA-sponsored terrestrial ecology data center, has utilized geospatial Web service technology, such as the Open Geospatial Consortium (OGC) Web Map Service (WMS) and OGC Web Coverage Service (WCS) standards, to increase the usability and availability of terrestrial ecology data sets. Data sets are standardized into non-proprietary file formats and distributed through OGC Web service standards (a request sketch follows below). OGC Web services allow the ORNL DAAC to store data sets in a single format and distribute them in multiple ways and formats. Registering the OGC Web services through search catalogues and other spatial data tools publicizes the data sets and makes them more available across the Internet. The ORNL DAAC has also created a Web-based graphical user interface called the Spatial Data Access Tool (SDAT) that utilizes OGC Web service standards and allows data distribution and consumption by users not familiar with OGC standards. SDAT also allows users to visualize a data set prior to download. Google Earth visualizations of the data sets are also provided through SDAT. The use of OGC Web service standards at the ORNL DAAC has enabled an increase in data consumption. In one case, a data set saw a ~10-fold increase in downloads through OGC Web services in comparison to the conventional FTP and WWW methods of access. The increase in downloads suggests that users are not only finding the data sets they need but are also able to consume them readily in the format they need.
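A minimal sketch of the kind of OGC WMS GetMap request that a tool like SDAT issues on the user's behalf; the endpoint and layer name are hypothetical, while the parameter names are defined by WMS 1.1.1.

```python
import requests

WMS_URL = "https://example.org/daac/wms"  # hypothetical endpoint

params = {
    "service": "WMS",
    "version": "1.1.1",
    "request": "GetMap",
    "layers": "npp_modis",  # assumed layer identifier
    "styles": "",
    "srs": "EPSG:4326",
    "bbox": "-180,-90,180,90",
    "width": "1024",
    "height": "512",
    "format": "image/png",
}

r = requests.get(WMS_URL, params=params, timeout=60)
r.raise_for_status()

with open("npp_global.png", "wb") as f:
    f.write(r.content)  # rendered map, ready for display
```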
Supporting the Application of Design Patterns in Web-Course Design.
ERIC Educational Resources Information Center
Frizell, Sherri S.; Hubscher, Roland
Many instructors are expected to design and create Web courses. The design of Web courses can be a difficult task for educators who lack experience in interaction and instructional design. Design patterns have emerged as a way to capture design experience and present design solutions to novice designers. Design patterns are a widely accepted…
Cellulosic fibers and nonwovens from solutions: Processing and properties
NASA Astrophysics Data System (ADS)
Dahiya, Atul
Cellulose is a renewable and bio-based material source extracted from wood that has the potential to generate value-added products such as composites, fibers, and nonwoven textiles. This research was focused on the potential of cellulose as the raw material for fiber spinning and melt blowing of nonwovens. The cellulose was dissolved in two different benign solvents: the amine oxide 4-N-methyl morpholine oxide monohydrate (NMMO•H2O) (lyocell process); and the ionic liquid (IL) 1-butyl-3-methylimidazolium chloride ([C4MIM]Cl). The solvents have essentially no vapor pressure and are biologically degradable, making them environmentally advantageous for manufacturing processes. The objectives of this research were to: (1) characterize solutions of cellulose in NMMO and [C4MIM]Cl; (2) develop processing techniques to melt blow nonwoven webs from cellulose using NMMO as a solvent; (3) electrospin cellulosic fibers from the [C4MIM]Cl solvent; (4) spin cellulosic single fibers from the [C4MIM]Cl solvent. Solutions of different concentrations of cellulose in NMMO and [C4MIM]Cl were initially characterized rheologically and thermally to understand their behavior under different conditions of stress, strain, and temperature. The results were used to determine processing conditions and concentrations for the melt blowing, fiber spinning, and electrospinning experiments. The cellulosic nonwoven webs and fibers were characterized for their physical and optical properties, such as tensile strength, water absorbency, fiber diameter, and fiber surface. Thermal properties were also measured by thermogravimetric analysis, differential scanning calorimetry, and dynamic mechanical analysis. Lyocell webs were successfully melt blown from the 14% cellulose solution. Basis weights of the webs were 27, 79, and 141 g/m2, and thicknesses ranged from 0.3 to 0.9 mm, depending on die temperature and die-to-collector distance. The average fiber diameter achieved was 2.3 microns. The 6% lyocell solutions exhibited poor spinnability and did not form nonwoven webs. The electrospun nonwoven webs obtained were evaluated for fiber diameter and surface/web structure using scanning electron microscopy (SEM). The fibers obtained were in the range of 17-25 microns, and the fiber surfaces and shapes varied with spinning conditions. A capillary rheometer was used to spin single fibers from [C4MIM]Cl. Circular fibers ranging from 12 to 84 microns in diameter were obtained.
This May 2003 document contains questions and answers on the Paper and Web Coating National Emission Standards for Hazardous Air Pollutants (NESHAP) regulation. The questions cover topics such as compliance, applicability, and initial notification.
40 CFR 63.3300 - Which of my emission sources are affected by this subpart?
Code of Federal Regulations, 2010 CFR
2010-07-01
... CATEGORIES National Emission Standards for Hazardous Air Pollutants: Paper and Other Web Coating What This... subject to this subpart is the collection of all web coating lines at your facility. This includes web coating lines engaged in the coating of metal webs that are used in flexible packaging, and web coating...
EPA's Web Taxonomy is a faceted hierarchical vocabulary used to tag web pages with terms from a controlled vocabulary. Tagging enables search and discovery of EPA's Web-based information assets. EPA's Web Taxonomy is being provided in Simple Knowledge Organization System (SKOS) format. SKOS is a standard for sharing and linking knowledge organization systems that promises to make Federal terminology resources more interoperable (see the sketch below).
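A short sketch of how such a SKOS vocabulary might be consumed with the rdflib Python library; the file name is hypothetical, and the traversal simply prints each concept's preferred label together with its broader concepts.

```python
from rdflib import Graph
from rdflib.namespace import SKOS

# Load the SKOS vocabulary (serialized as RDF/XML; file name assumed).
g = Graph()
g.parse("epa_web_taxonomy.rdf", format="xml")

# Walk the concept scheme: preferred labels plus broader links.
for concept, _, label in g.triples((None, SKOS.prefLabel, None)):
    broader = [str(b) for b in g.objects(concept, SKOS.broader)]
    print(f"{label}  broader: {broader or '(top concept)'}")
```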
40 CFR 63.3300 - Which of my emission sources are affected by this subpart?
Code of Federal Regulations, 2011 CFR
2011-07-01
... CATEGORIES National Emission Standards for Hazardous Air Pollutants: Paper and Other Web Coating What This... subject to this subpart is the collection of all web coating lines at your facility. This includes web coating lines engaged in the coating of metal webs that are used in flexible packaging, and web coating...
OpenID Connect as a security service in cloud-based medical imaging systems.
Ma, Weina; Sartipi, Kamran; Sharghigoorabi, Hassan; Koff, David; Bak, Peter
2016-04-01
The evolution of cloud computing is driving the next generation of medical imaging systems. However, privacy and security concerns have consistently been regarded as the major obstacles to the adoption of cloud computing in healthcare domains. OpenID Connect, which combines OpenID and OAuth, is an emerging representational state transfer-based federated identity solution. It is one of the most widely adopted open standards, positioned to become the de facto standard for securing cloud computing and mobile applications, and has been called the "Kerberos of the cloud." We introduce OpenID Connect as an authentication and authorization service in cloud-based diagnostic imaging (DI) systems, and propose enhancements that allow this technology to be incorporated within distributed enterprise environments. The objective of this study is to offer solutions for the secure sharing of medical images among diagnostic imaging repositories (DI-r) and heterogeneous picture archiving and communication systems (PACS), as well as Web-based and mobile clients, in the cloud ecosystem. The main objective is to use OpenID Connect's open-source single sign-on and authorization service in a user-centric manner, while deployments of DI-r and PACS to private or community clouds should provide security levels equivalent to the traditional computing model (a discovery sketch follows below).
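A minimal sketch of the first step such a client takes: fetching the provider's OpenID Connect discovery document from its standard well-known URL. The issuer is hypothetical; the discovery path and field names are defined by the OpenID Connect Discovery specification.

```python
import requests

ISSUER = "https://idp.example-hospital.org"  # hypothetical provider

config = requests.get(
    f"{ISSUER}/.well-known/openid-configuration", timeout=10
).json()

# A DI-r or PACS client needs only these endpoints to start the
# authorization code flow; no vendor-specific wiring is required.
print(config["authorization_endpoint"])
print(config["token_endpoint"])
print(config["jwks_uri"])  # keys for validating ID token signatures
```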
Spatial data standards meet meteorological data - pushing the boundaries
NASA Astrophysics Data System (ADS)
Wagemann, Julia; Siemen, Stephan; Lamy-Thepaut, Sylvie
2017-04-01
The data archive of the European Centre for Medium-Range Weather Forecasts (ECMWF) holds around 120 PB of data and is the world's largest archive of meteorological data. This information is of great value for many Earth Science disciplines, but the complexity of the data (up to five dimensions and different time-axis domains) and its native data format GRIB, while an efficient archive format, limit the overall data uptake, especially by users outside the MetOcean domain. ECMWF's MARS WebAPI is a very efficient and flexible system for expert users to access and retrieve meteorological data, though challenging for users outside the MetOcean domain. With the help of web-based standards for data access and processing, ECMWF wants to make more than 1 PB of meteorological and climate data more easily accessible to users across different Earth Science disciplines. As the climate data provider for the H2020 project EarthServer-2, ECMWF is exploring the feasibility of giving on-demand access to its MARS archive via the OGC standard interface Web Coverage Service (WCS) (see the sketch below). Despite the potential a WCS offers for climate and meteorological data, standards-based modelling of meteorological and climate data entails many challenges and reveals the boundaries of the current Web Coverage Service 2.0 standard. Challenges range from valid semantic data models for meteorological data to optimal and efficient data structures for a scalable web service. The presentation reviews the applicability of the current Web Coverage Service 2.0 standard to meteorological and climate data and discusses the challenges that must be overcome in order to achieve real interoperability and to ensure the conformant sharing and exchange of meteorological data.
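A sketch of the on-demand access pattern under discussion: a WCS 2.0 GetCoverage request that subsets along latitude, longitude and time, so only the requested slice leaves the archive. The endpoint, coverage identifier and axis labels are assumptions; axis naming in particular varies between deployments.

```python
import requests

WCS_URL = "https://example.org/ecmwf/wcs"  # hypothetical endpoint

params = [
    ("service", "WCS"),
    ("version", "2.0.1"),
    ("request", "GetCoverage"),
    ("coverageId", "temperature_2m"),  # assumed coverage identifier
    # Trim each axis so only the slice of interest is transferred:
    ("subset", "Lat(35,72)"),
    ("subset", "Long(-25,45)"),
    ("subset", 'ansi("2015-01-01T00:00","2015-12-31T18:00")'),
    ("format", "application/netcdf"),
]

r = requests.get(WCS_URL, params=params, timeout=120)
r.raise_for_status()

with open("t2m_europe_2015.nc", "wb") as f:
    f.write(r.content)
```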
Torres-Sospedra, Joaquín; Jiménez, Antonio R; Moreira, Adriano; Lungenstrass, Tomás; Lu, Wei-Chung; Knauth, Stefan; Mendoza-Silva, Germán Martín; Seco, Fernando; Pérez-Navarro, Antoni; Nicolau, Maria João; Costa, António; Meneses, Filipe; Farina, Joaquín; Morales, Juan Pablo; Lu, Wen-Chen; Cheng, Ho-Ti; Yang, Shi-Shen; Fang, Shih-Hau; Chien, Ying-Ren; Tsao, Yu
2018-02-06
The development of indoor positioning solutions using smartphones is a growing activity with enormous potential for everyday life and professional applications. Research on this topic concentrates on the development of new positioning solutions that are tested in specific environments under their own evaluation metrics. To explore the real positioning quality of smartphone-based solutions and their capability to adapt seamlessly to different scenarios, fair evaluation frameworks need to be found. The design of competitions using extensive pre-recorded datasets is a valid way to generate open data for comparing the different solutions created by research teams. In this paper, we discuss the details of the 2017 IPIN indoor localization competition, the different datasets created, the teams participating in the event, and the results they obtained. We compare these results with other competition-based approaches (Microsoft and Perf-loc) and online evaluation web sites. The lessons learned from organising these competitions and the benefits for the community are addressed throughout the paper. Our analysis paves the way for future developments in the standardization of evaluations and for creating a widely adopted benchmark strategy for researchers and companies in the field.
NASA Technical Reports Server (NTRS)
Habermann, Ted; Gallagher, James; Jelenak, Aleksandar; Potter, Nathan; Lee, Joe; Yang, Kent
2017-01-01
This study explored three candidate architectures with different types of objects and access paths for serving NASA Earth Science HDF5 data via Hyrax running on Amazon Web Services (AWS). We studied the cost and performance of each architecture using several representative use cases. The objectives of the study were to: (1) conduct a trade study to identify one or more high-performance integrated solutions for storing and retrieving NASA HDF5 and netCDF4 data in a cloud (web object store) environment, the target environment being Amazon Web Services (AWS) Simple Storage Service (S3); (2) conduct the level of software development needed to properly evaluate solutions in the trade study and to obtain the benchmarking metrics required as input to a government decision on potential follow-on prototyping; and (3) develop a cloud cost model for the preferred data storage solution (or solutions) that accounts for different granulation and aggregation schemes as well as cost and performance trades. We will describe the three architectures and the use cases along with performance results and recommendations for further work (a sketch of cloud-hosted HDF5 access follows below).
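As a simple illustration of cloud-hosted HDF5 access (not the Hyrax architectures studied here), the sketch below reads a subset of a granule directly from S3 using the s3fs and h5py Python libraries; bucket, key and dataset path are hypothetical.

```python
import h5py
import s3fs

# s3fs exposes S3 objects as file-like objects backed by HTTP range
# reads, and h5py can open any file-like object.
fs = s3fs.S3FileSystem(anon=True)

with fs.open("s3://example-nasa-bucket/granule_001.h5", "rb") as s3file:
    with h5py.File(s3file, "r") as f:
        # Only the chunks backing this slice are fetched from S3,
        # not the whole granule.
        subset = f["/science/temperature"][0:100, 0:100]
        print(subset.mean())
```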
78 FR 42775 - CGI Federal, Inc., and Custom Applications Management; Transfer of Data
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-17
... develop applications, Web sites, Web pages, web-based applications and databases, in accordance with EPA policies and related Federal standards and procedures. The Contractor will provide…
Current state of web accessibility of Malaysian ministries websites
NASA Astrophysics Data System (ADS)
Ahmi, Aidi; Mohamad, Rosli
2016-08-01
Despite the fact that Malaysian public institutions have progressed considerably in website and portal usage, web accessibility has been reported as one of the issues that deserves special attention. Consistent with the government's moves to promote effective use of the web and portals, it is essential for government institutions to ensure compliance with established standards and guidelines on web accessibility. This paper evaluates the accessibility of 25 Malaysian ministry websites using two automated tools, WAVE and AChecker. Both tools are designed to objectively evaluate web accessibility in conformance with the Web Content Accessibility Guidelines 2.0 (WCAG 2.0) and the United States Rehabilitation Act 1973 (Section 508). The findings reveal somewhat low compliance with web accessibility standards amongst the ministries. Further enhancement is needed for input elements such as labels and checkboxes, which should be associated with text, as well as for image-related elements (a minimal check sketch follows below). These findings could serve as a mechanism for webmasters to locate and rectify errors pertaining to web accessibility and to ensure equal access to web information and services for all citizens.
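Automated checkers such as WAVE and AChecker test many WCAG 2.0 criteria; the minimal Python sketch below reproduces just the two error classes highlighted above (images without alt text, inputs without an associated label), using requests and BeautifulSoup against a hypothetical URL.

```python
import requests
from bs4 import BeautifulSoup

def quick_wcag_checks(url):
    """Count images missing alt text and inputs without a <label for=...>."""
    soup = BeautifulSoup(requests.get(url, timeout=15).text, "html.parser")

    missing_alt = [img for img in soup.find_all("img") if not img.get("alt")]

    labelled_ids = {lab.get("for") for lab in soup.find_all("label")}
    unlabelled_inputs = [
        inp for inp in soup.find_all("input")
        if inp.get("type") not in ("hidden", "submit")
        and inp.get("id") not in labelled_ids
    ]
    return {"images_missing_alt": len(missing_alt),
            "inputs_without_label": len(unlabelled_inputs)}

print(quick_wcag_checks("https://www.example.gov.my"))  # hypothetical site
```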
The EMBRACE web service collection
Pettifer, Steve; Ison, Jon; Kalaš, Matúš; Thorne, Dave; McDermott, Philip; Jonassen, Inge; Liaquat, Ali; Fernández, José M.; Rodriguez, Jose M.; INB Partners; Pisano, David G.; Blanchet, Christophe; Uludag, Mahmut; Rice, Peter; Bartaseviciute, Edita; Rapacki, Kristoffer; Hekkelman, Maarten; Sand, Olivier; Stockinger, Heinz; Clegg, Andrew B.; Bongcam-Rudloff, Erik; Salzemann, Jean; Breton, Vincent; Attwood, Teresa K.; Cameron, Graham; Vriend, Gert
2010-01-01
The EMBRACE (European Model for Bioinformatics Research and Community Education) web service collection is the culmination of a 5-year project that set out to investigate issues involved in developing and deploying web services for use in the life sciences. The project concluded that in order for web services to achieve widespread adoption, standards must be defined for the choice of web service technology, for semantically annotating both service function and the data exchanged, and a mechanism for discovering services must be provided. Building on this, the project developed: EDAM, an ontology for describing life science web services; BioXSD, a schema for exchanging data between services; and a centralized registry (http://www.embraceregistry.net) that collects together around 1000 services developed by the consortium partners. This article presents the current status of the collection and its associated recommendations and standards definitions. PMID:20462862
Online plot services for paleomagnetism and rock magnetism
NASA Astrophysics Data System (ADS)
Hatakeyama, T.
2017-12-01
In paleomagnetism and rock magnetism, many specialized plot types are used to present measured data. Researchers in paleomagnetism often use not only general-purpose plotting programs such as Microsoft Excel but also single-purpose tools. A large benefit of the latter tools is that they can produce high-quality figures for our own data. However, those programs require a specific environment to operate, such as a particular hardware platform, operating system and version, or set of runtime libraries. It is therefore difficult to share results and graphics among collaborators who use different environments on their PCs. One of the best solutions is a program that runs in a widely available environment, and the most widely available environment is the web: almost all current operating systems ship with web browsers, and everyone uses them regularly. We now provide a web-based service for plotting paleomagnetic results easily. We developed original programs with a command-line user interface (non-GUI) and prepared web pages for entering the measured data and options, together with a wrapper script that transfers the entered values to the program (a minimal wrapper sketch follows below). The results, analyzed values and plotted graphs from the program are shown in an HTML page and are downloadable. Our plot services are provided at http://mage-p.org/mageplot/. In this talk, we introduce our program and service and discuss the philosophy and efficiency of these services.
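A minimal sketch of such a wrapper, assuming a hypothetical command-line plotter named magplot and using Flask rather than the authors' actual setup: the web form's values are forwarded to the program, and the resulting figure is returned to the browser.

```python
import subprocess
from flask import Flask, request, send_file

app = Flask(__name__)

@app.route("/plot", methods=["POST"])
def plot():
    # Save the uploaded measurements, then forward the form values to the
    # (hypothetical) command-line plotter, which writes its figure to -o.
    data_file = "/tmp/upload.txt"
    request.files["measurements"].save(data_file)
    subprocess.run(
        ["magplot", "-i", data_file,
         "-t", request.form["plot_type"],
         "-o", "/tmp/result.svg"],
        check=True,
    )
    return send_file("/tmp/result.svg", mimetype="image/svg+xml")

if __name__ == "__main__":
    app.run()
```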
Sharing on Web 3d Models of Ancient Theatres. a Methodological Workflow
NASA Astrophysics Data System (ADS)
Scianna, A.; La Guardia, M.; Scaduto, M. L.
2016-06-01
In the last few years, the need to share knowledge of Cultural Heritage (CH) on the Web through navigable 3D models has increased. This need requires the availability of Web-based virtual reality systems and 3D WebGIS. In order to make the information available to all stakeholders, these instruments should be powerful and at the same time very user-friendly. However, research and experiments carried out so far show that a standardized methodology does not yet exist. This is due both to the complexity and size of the geometric models to be published and to the excessive costs of hardware and software tools. Against this background, the paper describes a methodological approach for creating 3D models of CH, freely exportable on the Web, based on HTML5 and free and open source software. HTML5, supporting the WebGL standard, allows the exploration of 3D spatial models using the most widely used Web browsers, such as Chrome, Firefox, Safari and Internet Explorer. The methodological workflow described here has been tested in the construction of a multimedia geo-spatial platform developed for the three-dimensional exploration and documentation of the ancient theatres of Segesta and Carthage and their surrounding landscapes. The experimental application has allowed us to explore the potential and limitations of sharing 3D CH models on the Web based on the WebGL standard. Sharing capabilities could be extended by defining suitable geospatial Web services based on the capabilities of HTML5 and WebGL technology.
SensorWeb 3G: Extending On-Orbit Sensor Capabilities to Enable Near Realtime User Configurability
NASA Technical Reports Server (NTRS)
Mandl, Daniel; Cappelaere, Pat; Frye, Stuart; Sohlberg, Rob; Ly, Vuong; Chien, Steve; Tran, Daniel; Davies, Ashley; Sullivan, Don; Ames, Troy;
2010-01-01
This research effort prototypes an implementation of a standard interface, the Web Coverage Processing Service (WCPS), an Open Geospatial Consortium (OGC) standard, to enable users to define, test, upload and execute algorithms for on-orbit sensor systems. The user is able to customize the on-orbit data products that result from raw data streaming from an instrument. This extends the SensorWeb 2.0 concept, developed under a previous Advanced Information System Technology (AIST) effort, in which web services wrap sensors and a standardized Extensible Markup Language (XML) based scripting workflow language orchestrates processing steps across multiple domains. SensorWeb 3G extends the concept by providing the user controls into the flight software modules associated with an on-orbit sensor, and thus provides a degree of flexibility which does not presently exist. The successful demonstrations to date will be presented, including a realistic HyspIRI decadal mission testbed. Furthermore, the benchmarks that were run will be presented, along with planned future demonstrations and benchmark tests. Finally, we conclude with implications for the future and how this concept dovetails with efforts to develop "cloud computing" methods and standards.
Data partitioning enables the use of standard SOAP Web Services in genome-scale workflows.
Sztromwasser, Pawel; Puntervoll, Pål; Petersen, Kjell
2011-07-26
Biological databases and computational biology tools are provided by research groups around the world and made accessible on the Web. Combining these resources is a common practice in bioinformatics, but integration of heterogeneous and often distributed tools and datasets can be challenging. To date, this challenge has been commonly addressed in a pragmatic way, by tedious and error-prone scripting. Recently, however, a more reliable technique has been identified and proposed as the platform to tie bioinformatics resources together, namely Web Services. In the last decade, Web Services have spread widely in bioinformatics and earned the status of a recommended technology. However, in the era of high-throughput experimentation, a major concern regarding Web Services is their ability to handle large-scale data traffic. We propose a stream-like communication pattern for standard SOAP Web Services that enables efficient flow of large data traffic between a workflow orchestrator and Web Services (the partitioning pattern is sketched below). We evaluated the data-partitioning strategy by comparing it with typical communication patterns on an example pipeline for genomic sequence annotation. The results show that data partitioning lowers the resource demands of services and increases their throughput, which in consequence allows in-silico experiments to be executed at genome scale using standard SOAP Web Services and workflows. As a proof of principle we annotated an RNA-seq dataset using a plain BPEL workflow engine.
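A sketch of the partitioning pattern in Python; call_service stands in for a generated SOAP client stub, since the paper's concrete pipeline is not reproduced here.

```python
def partition(records, size):
    """Split a large dataset into fixed-size chunks."""
    batch = []
    for rec in records:
        batch.append(rec)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

def annotate_stream(sequences, call_service):
    """Stream-like pattern: each SOAP call carries one partition, so
    neither the orchestrator nor the service ever holds the whole
    genome-scale payload in memory. `call_service` stands in for a
    generated SOAP client stub."""
    for batch in partition(sequences, size=100):
        yield from call_service(batch)
```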
Food and beverage advertising on children's web sites.
Ustjanauskas, A E; Harris, J L; Schwartz, M B
2014-10-01
Food marketing contributes to childhood obesity. Food companies commonly place display advertising on children's web sites, but few studies have investigated this form of advertising. This study documents the number of food and beverage display advertisements viewed on popular children's web sites, the nutritional quality of advertised brands, and the proportion of advertising approved by food companies as healthier dietary choices for child-directed advertising. Syndicated Internet exposure data identified popular children's web sites and food advertisements viewed on these web sites from July 2009 through June 2010. Advertisements were classified according to food category and companies' participation in food industry self-regulation. The percentage of advertisements meeting government-proposed nutrition standards was calculated. 3.4 billion food advertisements appeared on popular children's web sites, 83% on just four web sites. Breakfast cereals and fast food were advertised most often (64% of ads). Most ads (74%) promoted brands approved by companies for child-directed advertising, but 84% advertised products that were high in fat, sugar and/or sodium. Ads for foods designated by companies as healthier dietary choices appropriate for child-directed advertising were least likely to meet independent nutrition standards. Most foods advertised on popular children's web sites do not meet independent nutrition standards. Further improvements to industry self-regulation are required. © 2013 The Authors. Pediatric Obesity © 2013 International Association for the Study of Obesity.
An improvement analysis on video compression using file segmentation
NASA Astrophysics Data System (ADS)
Sharma, Shubhankar; Singh, K. John; Priya, M.
2017-11-01
Over the past two decades the rapid evolution of the Internet has led to a massive rise in video technology and in video consumption over the Internet, which now accounts for the bulk of data traffic. Because video takes up so much of the data on the World Wide Web, reducing the burden on the Internet and the bandwidth consumed by video helps users access video data more easily. To this end, many video codecs have been developed, such as HEVC/H.265 and VP9, although comparing codecs like these raises the question of which is the better technology in terms of rate distortion and coding standard. This paper offers a solution to the difficulty of achieving low delay in video compression for video applications, e.g. ad-hoc video conferencing/streaming or surveillance monitoring. It also describes a benchmark of the HEVC and VP9 video compression techniques based on subjective assessments of High Definition video content played back in web browsers. Moreover, it presents an experimental approach of dividing the video file into several segments for compression and putting them back together, to improve the efficiency of video compression on the web as well as in offline mode (a segmentation sketch follows below).
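The segmentation idea can be sketched with ffmpeg driven from Python; the flags shown are standard ffmpeg options, while the file names and segment count are illustrative and the per-segment compression step is left abstract.

```python
import subprocess

# Split the source into ~10-second segments without re-encoding.
subprocess.run(
    ["ffmpeg", "-i", "input.mp4", "-c", "copy",
     "-f", "segment", "-segment_time", "10", "seg%03d.mp4"],
    check=True,
)

# ... compress each seg*.mp4 independently (e.g. with an H.265 or VP9
# encoder), then stitch the results back together with the concat demuxer:
with open("segments.txt", "w") as f:
    for i in range(12):  # assume 12 segments were produced
        f.write(f"file 'seg{i:03d}_compressed.mp4'\n")

subprocess.run(
    ["ffmpeg", "-f", "concat", "-safe", "0", "-i", "segments.txt",
     "-c", "copy", "output.mp4"],
    check=True,
)
```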
Web servlet-assisted, dial-in flow cytometry data analysis.
Battye, F
2001-02-01
The obvious benefits of centralized data storage notwithstanding, the size of modern flow cytometry data files discourages their transmission over commonly used telephone modem connections. The proposed solution is to install at the central location a web servlet that can extract compact data arrays, of a form dependent on the requested display type, from the stored files and transmit them to a remote client computer program for display. A client program and a web servlet, both written in the Java programming language, were designed to communicate over standard network connections. The client program creates familiar numerical and graphical display types and allows the creation of gates from combinations of user-defined regions. Data compression techniques further reduce transmission times for data arrays that are already much smaller than the data file itself. For typical data files, network transmission times were reduced more than 700-fold for extraction of one-dimensional (1-D) histograms, between 18 and 120-fold for 2-D histograms, and 6-fold for color-coded dot plots. Numerous display formats are possible without further access to the data file. This scheme enables telephone modem access to centrally stored data without restricting flexibility of display format or preventing comparisons with locally stored files. Copyright 2001 Wiley-Liss, Inc.
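The core idea, reducing a large event file to a compact display-specific array on the server, can be sketched in Python (the original is a Java servlet); the file layout, paths and Flask framing are assumptions.

```python
import json
import numpy as np
from flask import Flask, Response, request

app = Flask(__name__)

@app.route("/histogram")
def histogram():
    # Load one parameter column from the (hypothetical) stored event data
    # and reduce it server-side to a 256-bin histogram: a few hundred
    # numbers travel over the modem link instead of the full event list.
    events = np.load(f"/data/{request.args['file']}.npy")  # sketch only
    channel = int(request.args.get("channel", 0))
    counts, edges = np.histogram(events[:, channel], bins=256)
    return Response(
        json.dumps({"counts": counts.tolist(), "edges": edges.tolist()}),
        mimetype="application/json",
    )
```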
Jiang, Guoqian; Solbrig, Harold R; Chute, Christopher G
2011-01-01
A source of semantically coded Adverse Drug Event (ADE) data can be useful for identifying common phenotypes related to ADEs. We proposed a comprehensive framework for building a standardized ADE knowledge base (called ADEpedia) by combining an ontology-based approach with semantic web technology. The framework comprises four primary modules: 1) an XML-to-RDF transformation module; 2) a data normalization module based on the NCBO Open Biomedical Annotator; 3) an RDF-store-based persistence module; and 4) a front-end module based on a Semantic Wiki for review and curation. A prototype was successfully implemented to demonstrate the capability of the system to integrate multiple drug data sources, ontology resources and open web services for ADE data standardization. A preliminary evaluation was performed to demonstrate the usefulness of the system, including the performance of the NCBO Annotator. In conclusion, semantic web technology provides a highly scalable framework for ADE data source integration and a standard query service.
An optimized web-based approach for collaborative stereoscopic medical visualization
Kaspar, Mathias; Parsad, Nigel M; Silverstein, Jonathan C
2013-01-01
Objective: Medical visualization tools have traditionally been constrained to tethered imaging workstations or proprietary client viewers, typically part of hospital radiology systems. To improve accessibility to real-time, remote, interactive, stereoscopic visualization and to enable collaboration among multiple viewing locations, we developed an open source approach requiring only a standard web browser with no added client-side software. Materials and Methods: Our collaborative, web-based, stereoscopic visualization system, CoWebViz, has been used successfully for the past 2 years at the University of Chicago to teach immersive virtual anatomy classes. It is a server application that streams server-side visualization applications to client front-ends, comprised solely of a standard web browser with no added software. Results: We describe optimization considerations, usability, and performance results, which make CoWebViz practical for broad clinical use. We clarify technical advances including: enhanced threaded architecture, optimized visualization distribution algorithms, a wide range of supported stereoscopic presentation technologies, and the salient theoretical and empirical network parameters that affect our web-based visualization approach. Discussion: The implementations demonstrate usability and performance benefits of a simple web-based approach for complex clinical visualization scenarios. Using this approach overcomes technical challenges that require third-party web browser plug-ins, resulting in the most lightweight client. Conclusions: Compared to special software and hardware deployments, unmodified web browsers enhance remote user accessibility to interactive medical visualization. Whereas local hardware and software deployments may provide better interactivity than remote applications, our implementation demonstrates that a simplified, stable, client approach using standard web browsers is sufficient for high quality three-dimensional, stereoscopic, collaborative and interactive visualization. PMID:23048008
WEBTAS Software Life Cycle Development
2006-09-01
may be published in both HTML and PDF formats via menu selection, using Adobe® FrameMaker® 7.1 and Quadralay Corporation WebWorks® Professional 2003. The backbone of the ISS publishing environment consists of Adobe® FrameMaker® and WebWorks® Publisher Professional 2003. FrameMaker® provides an enterprise-class authoring and publishing solution that combines the…
WebSat--a web software for microsatellite marker development.
Martins, Wellington Santos; Lucas, Divino César Soares; Neves, Kelligton Fabricio de Souza; Bertioli, David John
2009-01-01
Simple sequence repeats (SSRs), also known as microsatellites, have been extensively used as molecular markers due to their abundance and high degree of polymorphism. We have developed a simple-to-use web application, called WebSat, for microsatellite molecular marker prediction and development. WebSat is accessible through the Internet and requires no program installation. Although a web solution, it makes use of Ajax techniques, providing a rich, responsive user interface. WebSat allows the submission of sequences, the visualization of microsatellites and the design of primers suitable for their amplification (the underlying repeat detection is sketched below). The program allows full control of parameters and easy export of the resulting data, thus facilitating the development of microsatellite markers. The web tool may be accessed at http://purl.oclc.org/NET/websat/
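Microsatellite detection itself reduces to finding short tandemly repeated motifs; the sketch below is a simple regex-based finder in Python, not WebSat's actual algorithm, with an illustrative threshold of five repeats.

```python
import re

def find_ssrs(seq, min_repeats=5):
    """Locate SSRs: mono- to hexanucleotide motifs tandemly repeated
    at least `min_repeats` times. Returns (position, motif, count)."""
    ssrs = []
    for motif_len in range(1, 7):
        pattern = re.compile(
            r"(([ACGT]{%d})\2{%d,})" % (motif_len, min_repeats - 1))
        for m in pattern.finditer(seq.upper()):
            ssrs.append((m.start(), m.group(2),
                         len(m.group(1)) // motif_len))
    return ssrs

print(find_ssrs("GGATATATATATATCCGAGAGAGAGAGATT"))
```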
Okamura, Kyoko; Bernstein, Judith; Fidler, Anne T
2002-01-01
The Internet has become a major source of health information for women, but information placed on the World Wide Web does not routinely undergo a peer review process before dissemination. In this study, we present an analysis of 197 infertility-related Web sites for quality and accountability, using JAMA's minimal core standards for responsible print. Only 2% of the web sites analyzed met all four recommended standards, and 50.8% failed to report any of the four. Commercial web sites were more likely to fail to meet minimum standards (71.2%) than those with educational (46.8%) or supportive (29.8%) elements. Web sites with educational and informational components were most common (70.6%), followed by commercial sites (52.8%) and sites that offered a forum for infertility support and activism (28.9%). Internet resources available to infertile patients are at best variable. The current state of infertility-related materials on the World Wide Web offers unprecedented opportunities to improve services to a growing number of e-health users. Because of variations in quality of site content, women's health clinicians must assume responsibility for a new role as information monitor. This study provides assessment tools clinicians can apply and share with clients.
Work of the Web Weavers: Web Development in Academic Libraries
ERIC Educational Resources Information Center
Bundza, Maira; Vander Meer, Patricia Fravel; Perez-Stable, Maria A.
2009-01-01
Although the library's Web site has become a standard tool for seeking information and conducting research in academic institutions, there are a variety of ways libraries approach the often challenging--and sometimes daunting--process of Web site development and maintenance. Three librarians at Western Michigan University explored issues related…
Web-based hydrodynamics computing
NASA Astrophysics Data System (ADS)
Shimoide, Alan; Lin, Luping; Hong, Tracie-Lynne; Yoon, Ilmi; Aragon, Sergio R.
2005-01-01
Proteins are long chains of amino acids that have a definite 3D conformation, and the shape of each protein is vital to its function. Since proteins are normally in solution, hydrodynamics (which describes the movement of solvent around a protein as a function of the shape and size of the molecule) can be used to probe the size and shape of proteins compared to those derived from X-ray crystallography. The computation chain needed for these hydrodynamics calculations consists of several separate programs by different authors on various platforms and often requires 3D visualizations of intermediate results. Due to this complexity, tools developed by a particular research group are not readily available for use by other groups, nor even by the non-experts within the same research group. To alleviate this situation, and to foster the easy and wide distribution of computational tools worldwide, we developed a web-based interactive computational environment (WICE), including interactive 3D visualization, that can be used with any web browser. Java-based technologies were used to provide a platform-neutral, user-friendly solution. Java Server Pages (JSP), Java Servlets, Java Beans, JOGL (Java bindings for OpenGL), and Java Web Start were used to create a solution that simplifies the computing chain for the user, allowing the user to focus on their scientific research. WICE hides complexity from the user and provides robust and sophisticated visualization through a web browser.
NASA Astrophysics Data System (ADS)
Paulraj, D.; Swamynathan, S.; Madhaiyan, M.
2012-11-01
Web Service composition has become indispensable as a single web service cannot satisfy complex functional requirements. Composition of services has received much interest as a means of supporting business-to-business (B2B) or enterprise application integration. An important component of service composition is the discovery of relevant services. In Semantic Web Services (SWS), service discovery is generally achieved by using the service profile of the Ontology Web Language for Services (OWL-S). The profile of the service is a derived and concise description, but not a functional part of the service. The information contained in the service profile is sufficient for atomic service discovery, but it is not sufficient for the discovery of composite semantic web services (CSWS). The purpose of this article is two-fold: first, to show that the process model is a better choice than the service profile for service discovery; second, to facilitate the composition of inter-organisational CSWS by proposing a new composition method which uses process ontology. The proposed service composition approach uses an algorithm which performs a fine-grained match at the level of the atomic process rather than at the level of the entire service in a composite semantic web service. Many works carried out in this area have proposed solutions only for the composition of atomic services, and this article proposes a solution for the composition of composite semantic web services.
16 CFR 1031.18 - Method of review and comment.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Commission staff is involved shall have a unique Web link on the Commission Web site with relevant... forwarded to appropriate staff for consideration and/or response. (c) On the voluntary standards Web site...
16 CFR 1031.18 - Method of review and comment.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Commission staff is involved shall have a unique Web link on the Commission Web site with relevant... forwarded to appropriate staff for consideration and/or response. (c) On the voluntary standards Web site...
16 CFR 1031.18 - Method of review and comment.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Commission staff is involved shall have a unique Web link on the Commission Web site with relevant... forwarded to appropriate staff for consideration and/or response. (c) On the voluntary standards Web site...
Providing Assistive Technology Applications as a Service Through Cloud Computing.
Mulfari, Davide; Celesti, Antonio; Villari, Massimo; Puliafito, Antonio
2015-01-01
Users with disabilities interact with Personal Computers (PCs) using Assistive Technology (AT) software solutions. Such applications run on the PC that a person with a disability commonly uses. However, the configuration of AT applications is not trivial at all, especially whenever the user needs to work on a PC that does not allow him/her to rely on his/her AT tools (e.g., at work, at university, or at an Internet point). In this paper, we discuss how cloud computing provides a valid technological solution to enhance such a scenario. With the emergence of cloud computing, many applications are executed on top of virtual machines (VMs). Virtualization allows us to achieve a software implementation of a real computer able to execute a standard operating system and any kind of application. In this paper we propose to build personalized VMs running AT programs and settings. By using remote desktop technology, our solution enables users to control their customized virtual desktop environment by means of an HTML5-based web interface running on any computer equipped with a browser, wherever they are.
NASA Technical Reports Server (NTRS)
2001-01-01
REI Systems, Inc. developed a software solution that uses the Internet to eliminate the paperwork typically required to document and manage complex business processes. The data management solution, called Electronic Handbooks (EHBs), is presently used for all SBIR program processes at NASA. The EHB-based system is ideal for programs and projects whose users are geographically distributed and are involved in complex management processes and procedures. EHBs provide flexible access control and increased communications while maintaining security for systems of all sizes. Through Internet Protocol-based access, user authentication and user-based access restrictions, role-based access control, and encryption/decryption, EHBs provide the level of security required for confidential data transfer. EHBs contain electronic forms and menus, which can be used in real time to execute the described processes. EHBs use standard word processors that generate ASCII HTML code to set up electronic forms that are viewed within a web browser. EHBs require no end-user software distribution, significantly reducing operating costs. Each interactive handbook simulates a hard-copy version containing chapters with descriptions of participants' roles in the online process.
The Environmental Assessment and Management (TEAM) Guide: Iowa Supplement
2010-02-01
limited to, paper shredding, copying, photographic activities, and blueprinting machines. This does not include incinerators. r. Laundry dryers… exemptions from these standards may apply. (Part 63, Subpart IIII) cj. Emission standards for hazardous air pollutants: paper and other web coating. This standard applies to a facility that is engaged in the coating of paper, plastic film, metallic foil, and other web surfaces located at a…
Code of Federal Regulations, 2010 CFR
2010-07-01
... must be certified by the manufacturer to be accurate to within ±2.0 percent by mass. (e) Continuous... Pollutants: Paper and Other Web Coating General Requirements for Compliance with the Emission Standards and... standards, what monitoring must I do? (a) A summary of monitoring you must do follows: If you operate a web...
Code of Federal Regulations, 2011 CFR
2011-07-01
... must be certified by the manufacturer to be accurate to within ±2.0 percent by mass. (e) Continuous... Pollutants: Paper and Other Web Coating General Requirements for Compliance with the Emission Standards and... standards, what monitoring must I do? (a) A summary of monitoring you must do follows: If you operate a web...
Karim, Md Rezaul; Michel, Audrey; Zappa, Achille; Baranov, Pavel; Sahay, Ratnesh; Rebholz-Schuhmann, Dietrich
2017-04-16
Data workflow systems (DWFSs) enable bioinformatics researchers to combine components for data access and data analytics, and to share the final data analytics approach with their collaborators. Increasingly, such systems have to cope with large-scale data, such as full genomes (about 200 GB each), public fact repositories (about 100 TB of data) and 3D imaging data at even larger scales. As moving the data becomes cumbersome, the DWFS needs to embed its processes into a cloud infrastructure, where the data are already hosted. As the standardized public data play an increasingly important role, the DWFS needs to comply with Semantic Web technologies. This advancement to DWFS would reduce overhead costs and accelerate the progress in bioinformatics research based on large-scale data and public resources, as researchers would require less specialized IT knowledge for the implementation. Furthermore, the high data growth rates in bioinformatics research drive the demand for parallel and distributed computing, which then imposes a need for scalability and high-throughput capabilities onto the DWFS. As a result, requirements for data sharing and access to public knowledge bases suggest that compliance of the DWFS with Semantic Web standards is necessary. In this article, we will analyze the existing DWFS with regard to their capabilities toward public open data use as well as large-scale computational and human interface requirements. We untangle the parameters for selecting a preferable solution for bioinformatics research with particular consideration to using cloud services and Semantic Web technologies. Our analysis leads to research guidelines and recommendations toward the development of future DWFS for the bioinformatics research community. © The Author 2017. Published by Oxford University Press.
Developing a modular architecture for creation of rule-based clinical diagnostic criteria.
Hong, Na; Pathak, Jyotishman; Chute, Christopher G; Jiang, Guoqian
2016-01-01
With recent advances in computerized patient record systems, there is an urgent need to produce computable and standards-based clinical diagnostic criteria. Notably, constructing rule-based clinical diagnostic criteria has become one of the goals of the International Classification of Diseases (ICD)-11 revision. However, few studies have been done on building a unified architecture to support the need for diagnostic criteria computerization. In this study, we present a modular architecture for enabling the creation of rule-based clinical diagnostic criteria leveraging Semantic Web technologies. The architecture consists of two modules: an authoring module that utilizes a standards-based information model and a translation module that leverages the Semantic Web Rule Language (SWRL). In a prototype implementation, we created a diagnostic criteria upper ontology (DCUO) that integrates the ICD-11 content model with the Quality Data Model (QDM). Using the DCUO, we developed a transformation tool that converts QDM-based diagnostic criteria into SWRL representation. We evaluated the domain coverage of the upper ontology model using randomly selected diagnostic criteria from broad domains (n = 20). We also tested the transformation algorithms using 6 QDM templates for ontology population and 15 QDM-based criteria data for rule generation. As a result, the first draft of the DCUO contains 14 root classes, 21 subclasses, 6 object properties and 1 data property. Investigation Findings and Signs and Symptoms are the two most commonly used element types. All 6 HQMF templates were successfully parsed and populated into their corresponding domain-specific ontologies, and 14 rules (93.3%) passed rule validation. Our efforts in developing and prototyping a modular architecture provide useful insight into how to build a scalable solution to support diagnostic criteria representation and computerization.
Wearable Technology for Global Surgical Teleproctoring.
Datta, Néha; MacQueen, Ian T; Schroeder, Alexander D; Wilson, Jessica J; Espinoza, Juan C; Wagner, Justin P; Filipi, Charles J; Chen, David C
2015-01-01
In underserved communities around the world, inguinal hernias represent a significant burden of surgically-treatable disease. With traditional models of international surgical assistance limited to mission trips, a standardized framework to strengthen local healthcare systems is lacking. We established a surgical education model using web-based tools and wearable technology to allow for long-term proctoring and assessment in a resource-poor setting. This is a feasibility study examining wearable technology and web-based performance rating tools for long-term proctoring in an international setting. Using the Lichtenstein inguinal hernia repair as the index surgical procedure, local surgeons in Paraguay and Brazil were trained in person by visiting international expert trainers using a formal, standardized teaching protocol. Surgeries were captured in real-time using Google Glass and transmitted wirelessly to an online video stream, permitting real-time observation and proctoring by mentoring surgeon experts in remote locations around the world. A system for ongoing remote evaluation and support by experienced surgeons was established using the Lichtenstein-specific Operative Performance Rating Scale. Data were collected from 4 sequential training operations for surgeons trained in both Paraguay and Brazil. With continuous internet connectivity, live streaming of the surgeries was successful. The Operative Performance Rating Scale was immediately used after each operation. Both surgeons demonstrated proficiency at the completion of the fourth case. A sustainable model for surgical training and proctoring to empower local surgeons in resource-poor locations and "train trainers" is feasible with wearable technology and web-based communication. Capacity building by maximizing use of local resources and expertise offers a long-term solution to reducing the global burden of surgically-treatable disease. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Mroczkiewicz, Pawel
The need to integrate the information systems and office software used in organizations has a long history. Solutions of this kind date back to an older generation of data-interchange protocols known as EDI (Electronic Data Interchange) and the EDIFACT standard, which was initiated in 1988 and has evolved dynamically ever since (S. Michalski, M. Suskiewicz, 1995). The protocol was usually used for converting documents into the native formats processed by applications. This caused problems with binary files and, furthermore, the communication mechanisms had to be modified each time new documents or applications were added. Compared with the previously used communication mechanisms, however, EDI was a great step forward, as it was the first large-scale attempt to define standards for data interchange between applications in business transactions (V. Leyland, 1995, p. 47).
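As an illustration of the segment-oriented syntax that made EDIFACT messages machine-processable, here is a minimal Python sketch of an EDIFACT-style tokenizer. The sample interchange is invented, and the release character `?` used for escaping in real EDIFACT is ignored for brevity.

```python
# Hedged sketch: splitting a (simplified) EDIFACT interchange into segments,
# elements and components using the default service characters ' + :
RAW = "UNB+UNOA:2+SENDER+RECEIVER+200101:1200+REF001'UNH+1+ORDERS:D:96A:UN'"

def parse_edifact(raw: str):
    """Return a list of segments; each segment is (tag, [elements])."""
    segments = []
    for seg in filter(None, raw.split("'")):      # ' terminates a segment
        elements = seg.split("+")                 # + separates data elements
        tag, data = elements[0], elements[1:]
        # : separates component data elements inside an element
        segments.append((tag, [e.split(":") for e in data]))
    return segments

for tag, data in parse_edifact(RAW):
    print(tag, data)
```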
A web service for service composition to aid geospatial modelers
NASA Astrophysics Data System (ADS)
Bigagli, L.; Santoro, M.; Roncella, R.; Mazzetti, P.
2012-04-01
The identification of appropriate mechanisms for process reuse, chaining and composition is considered a key enabler for the effective uptake of a global Earth Observation infrastructure, currently pursued by the international geospatial research community. In the Earth and Space Sciences, such a facility could primarily enable integrated and interoperable modeling, for which several approaches have been proposed and developed over recent years. In fact, GEOSS is specifically tasked with the development of the so-called "Model Web". At increasing levels of abstraction and generalization, the initial stove-pipe software tools have evolved into community-wide modeling frameworks and Component-Based Architecture solutions, and have more recently started to embrace Service-Oriented Architecture technologies, such as the OGC WPS specification and the WS-* stack of W3C standards for service composition. However, so far, the level of abstraction seems too low for implementing the Model Web vision, and far too complex technological aspects must still be addressed by both providers and users, resulting in limited usability and, eventually, difficult uptake. Following the recent ICT trend of resource virtualization, it has been suggested that users in need of a particular processing capability, required by a given modeling workflow, may benefit from outsourcing the composition activities to an external first-class service, according to the Composition as a Service (CaaS) approach. A CaaS system provides the necessary interoperability service framework for adaptation, reuse and complementation of existing processing resources (including models and geospatial services in general) in the form of executable workflows. This work introduces the architecture of a CaaS system, as a distributed information system for creating, validating, editing, storing, publishing, and executing geospatial workflows. In this way, users are freed from the need to operate a composition infrastructure, are relieved of the technicalities of workflow definition (type matching, identification of external service endpoints, binding issues, etc.), and can focus on their intended application. Moreover, a user may submit an incomplete workflow definition and leverage CaaS recommendations (that may derive from an aggregated knowledge base of user feedback, underpinned by Web 2.0 technologies) to execute it. This is of particular interest for multidisciplinary scientific contexts, where different communities may benefit from each other's knowledge through model chaining. Indeed, the CaaS approach is presented as an attempt to combine the recent advances in service-oriented computing with collaborative research principles, and social network information in general. Arguably, it may be considered a fundamental capability of the Model Web. The CaaS concept is being investigated in several application scenarios identified in the FP7 UncertWeb and EuroGEOSS projects. Key aspects of the described CaaS solution are: it provides a standard WPS interface for invoking Business Processes and allows on-the-fly recursive composition of Business Processes into other Composite Processes; it is designed according to the extended SOA (broker-based) and System-of-Systems approaches, to support the reuse and integration of existing resources, in compliance with the GEOSS Model Web architecture. The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement n° 248488.
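For comparison, the sketch below shows the kind of low-level WPS interaction that a CaaS facade is meant to hide from the user. It uses Python with the OWSLib client library; the endpoint, process identifier and input names are hypothetical.

```python
# pip install OWSLib -- a sketch of direct WPS usage; endpoint, process
# identifier and inputs are hypothetical.
from owslib.wps import WebProcessingService, monitorExecution

wps = WebProcessingService("http://example.org/wps")  # hypothetical endpoint
wps.getcapabilities()
print([p.identifier for p in wps.processes])

# Execute a (hypothetical) composite process published behind the WPS interface
execution = wps.execute(
    "composite.runoffModel",
    inputs=[("precipitation", "42.0"), ("catchmentId", "C-17")],
)
monitorExecution(execution)  # polls the status document until completion
print(execution.status)
```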
The OGC Sensor Web Enablement framework
NASA Astrophysics Data System (ADS)
Cox, S. J.; Botts, M.
2006-12-01
Sensor observations are at the core of the natural sciences. Improvements in data-sharing technologies offer the promise of much greater utilisation of observational data. A key to this is interoperable data standards. The Open Geospatial Consortium's (OGC) Sensor Web Enablement (SWE) initiative is developing open standards for web interfaces for the discovery, exchange and processing of sensor observations, and the tasking of sensor systems. The goal is to support the construction of complex sensor applications through real-time composition of service chains from standard components. The framework is based around a suite of standard interfaces, and standard encodings for the messages transferred between services. The SWE interfaces include: Sensor Observation Service (SOS)-parameterized observation requests (by observation time, feature of interest, property, sensor); Sensor Planning Service (SPS)-tasking a sensor system to undertake future observations; Sensor Alert Service (SAS)-subscription to an alert, usually triggered by a sensor result exceeding some value. The interface design generally follows the pattern established in the OGC Web Map Service (WMS) and Web Feature Service (WFS) interfaces, where the interaction between a client and service follows a standard sequence of requests and responses: the first obtains a general description of the service capabilities, the second the detail required to formulate a data request, and the last a data instance or stream. These may be implemented in a stateless "REST" idiom, or using conventional "web-services" (SOAP) messaging. In a deployed system, the SWE interfaces are supplemented by Catalogue, data (WFS) and portrayal (WMS) services, as well as authentication and rights management. The standard SWE data formats are Observations and Measurements (O&M), which encodes observation metadata and results; Sensor Model Language (SensorML), which describes sensor systems; Transducer Model Language (TML), which covers low-level data streams; and domain-specific GML Application Schemas for definitions of the target feature types. The SWE framework has been demonstrated in several interoperability testbeds, based around emergency management, security, contamination and environmental monitoring scenarios.
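The capabilities-then-data request sequence described above can be illustrated with a small Python sketch against an SOS, using the key-value-pair binding standardized later in SOS 2.0. The service URL, offering and observed property are hypothetical.

```python
# A sketch of the GetCapabilities -> GetObservation sequence via KVP requests;
# URL, offering and observed property are hypothetical placeholders.
import requests

SOS = "http://example.org/sos"  # hypothetical Sensor Observation Service

# 1. Obtain the general description of the service capabilities
caps = requests.get(SOS, params={"service": "SOS", "request": "GetCapabilities"})
print(caps.status_code)

# 2. Request observations for one offering, property and time window
obs = requests.get(SOS, params={
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "water_level",  # hypothetical offering
    "observedProperty": "http://example.org/def/gauge_height",
    "temporalFilter": "om:phenomenonTime,2006-12-01/2006-12-31",
})
print(obs.status_code, obs.headers.get("Content-Type"))
```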
NASA Astrophysics Data System (ADS)
Auer, M.; Agugiaro, G.; Billen, N.; Loos, L.; Zipf, A.
2014-05-01
Many important Cultural Heritage sites have been studied over long periods of time by different researchers, using different technical equipment, methods and research aims. This has led to huge amounts of heterogeneous "traditional" datasets and formats. The rising popularity of 3D models in the field of Cultural Heritage in recent years has brought additional data formats and makes it even more necessary to find solutions to manage, publish and study these data in an integrated way. The MayaArch3D project aims to realize such an integrative approach by establishing a web-based research platform that brings spatial and non-spatial databases together and provides visualization and analysis tools. The 3D components of the platform in particular use hierarchical segmentation concepts to structure the data and to perform queries on semantic entities. This paper presents a database schema that organizes not only segmented models but also different Levels-of-Detail and other representations of the same entity. It is implemented in a spatial database, which allows the storing of georeferenced 3D data and enables organization and queries by semantic, geometric and spatial properties. As the service for delivering the segmented models, a standardization candidate of the Open Geospatial Consortium (OGC), the Web 3D Service (W3DS), has been extended to cope with the new database schema and deliver a web-friendly format for WebGL rendering. Finally, a generic user interface is presented which uses the segments as a navigation metaphor to browse and query the semantic segmentation levels and retrieve information from an external database of the German Archaeological Institute (DAI).
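A minimal sketch of such a schema (an assumed simplification, not the project's actual design) can be expressed with two tables: a self-referential segment hierarchy and a representation table keyed by Level-of-Detail. Here in Python/SQLAlchemy, with geometry kept as WKT text where a real deployment would use a PostGIS geometry column.

```python
# pip install sqlalchemy -- an assumed schema sketch, not the MayaArch3D one.
from sqlalchemy import Column, ForeignKey, Integer, String, Text, create_engine
from sqlalchemy.orm import declarative_base, relationship

Base = declarative_base()

class Segment(Base):
    """A semantic entity (site, building, room, fragment) in the hierarchy."""
    __tablename__ = "segment"
    id = Column(Integer, primary_key=True)
    name = Column(String, nullable=False)
    parent_id = Column(Integer, ForeignKey("segment.id"))  # self-referential tree
    children = relationship("Segment")

class Representation(Base):
    """One geometric representation of a segment at a given Level-of-Detail."""
    __tablename__ = "representation"
    id = Column(Integer, primary_key=True)
    segment_id = Column(Integer, ForeignKey("segment.id"), nullable=False)
    lod = Column(Integer, nullable=False)  # 0 = coarsest
    geometry_wkt = Column(Text)            # georeferenced geometry (WKT here)

engine = create_engine("sqlite://")  # in-memory demo database
Base.metadata.create_all(engine)
```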
NASA Astrophysics Data System (ADS)
Colomo-Palacios, Ricardo; Jiménez-López, Diego; García-Crespo, Ángel; Blanco-Iglesias, Borja
eLearning educative processes are a challenge for educational institutions and education professionals. In an environment in which learning resources are being produced, catalogued and stored in innovative ways, SOLE provides a platform in which exam questions can be produced with the support of Web 2.0 tools, catalogued and labeled via the Semantic Web, and stored and distributed using eLearning standards. This paper presents SOLE, a social network for sharing exam questions, particularized to the software engineering domain, based on semantics and built using Semantic Web and eLearning standards such as the IMS Question and Test Interoperability specification 2.1.
Business intelligence and capacity planning: web-based solutions.
James, Roger
2010-07-01
Income (activity) and expenditure (costs) form the basis of a modern hospital's 'business intelligence'. However, clinical engagement in business intelligence is patchy. This article describes the principles of business intelligence and outlines some recent developments using web-based applications.
Lamprey: tracking users on the World Wide Web.
Felciano, R M; Altman, R B
1996-01-01
Tracking individual web sessions provides valuable information about user behavior. This information can be used for general purpose evaluation of web-based user interfaces to biomedical information systems. To this end, we have developed Lamprey, a tool for doing quantitative and qualitative analysis of Web-based user interfaces. Lamprey can be used from any conforming browser, and does not require modification of server or client software. By rerouting WWW navigation through a centralized filter, Lamprey collects the sequence and timing of hyperlinks used by individual users to move through the web. Instead of providing marginal statistics, it retains the full information required to recreate a user session. We have built Lamprey as a standard Common Gateway Interface (CGI) that works with all standard WWW browsers and servers. In this paper, we describe Lamprey and provide a short demonstration of this approach for evaluating web usage patterns.
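The rerouting idea can be sketched as a small CGI script: every hyperlink on pages served through the filter points back at the script with the true target in the query string, so each navigation event is logged with its timestamp before redirecting. The parameter names and log format below are illustrative, not Lamprey's actual ones.

```python
#!/usr/bin/env python3
# Illustrative sketch of a centralized CGI navigation filter; parameter names
# and log format are invented, not Lamprey's actual conventions.
import os
import sys
import time
from urllib.parse import parse_qs

query = parse_qs(os.environ.get("QUERY_STRING", ""))
session = query.get("session", ["anonymous"])[0]
target = query.get("url", [""])[0]

# One line per navigation event: enough to replay the full session later
with open("/tmp/lamprey.log", "a") as log:
    log.write(f"{time.time()}\t{session}\t{target}\n")

# Send the browser on to the real destination
sys.stdout.write(f"Status: 302 Found\r\nLocation: {target}\r\n\r\n")
```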
Eccher, Claudio; Eccher, Lorenzo; Izzo, Umberto
2005-01-01
In this poster we describe the security solutions implemented in a web-based cooperative work framework for managing heart failure patients among the different health care professionals involved in the care process. The solution, developed in close collaboration with the Law Department of the University of Trento, is compliant with the new Italian Personal Data Protection Code, issued in 2003, which also regulates the storing and processing of health data.
Current Efforts in European Projects to Facilitate the Sharing of Scientific Observation Data
NASA Astrophysics Data System (ADS)
Bredel, Henning; Rieke, Matthes; Maso, Joan; Jirka, Simon; Stasch, Christoph
2017-04-01
This presentation is intended to provide an overview of currently ongoing efforts in European projects to facilitate and promote the interoperable sharing of scientific observation data. This will be illustrated through two examples: a prototypical portal developed in the ConnectinGEO project for matching available (in-situ) data sources to the needs of users, and a joint activity of several research projects to harmonise the usage of the OGC Sensor Web Enablement standards for providing access to marine observation data. ENEON is an activity initiated by the European ConnectinGEO project to coordinate in-situ Earth observation networks, with the aim of harmonising access to observations, improving discoverability, and identifying and closing gaps in European Earth observation data resources. In this context, ENEON Commons has been developed as a supporting web portal for facilitating the discovery, access, re-use and creation of knowledge about observations, networks, and related activities (e.g. projects). The portal is based on developments resulting from the European WaterInnEU project and has been extended to cover the requirements for handling knowledge about in-situ Earth observation networks. A first prototype of the portal was completed in January 2017; it offers functionality for interactive discussion, information exchange and querying information about data delivered by different observation networks. Within this presentation, we will introduce this prototype and initiate a discussion about potential future work directions. The second example concerns the harmonisation of data exchange in the marine domain. Many organisations operate ocean observatories or data archives. In recent years, the application of OGC Sensor Web Enablement (SWE) technology has become more and more popular as a way to increase interoperability between marine observation networks. However, as the SWE standards were intentionally designed in a domain-independent manner, there remain significant degrees of freedom in how the same information can be handled within the SWE framework. Thus, further domain-specific agreements are necessary to describe more precisely how SWE standards shall be applied in specific contexts. Within this presentation we will report the current status of the marine SWE profiles initiative, which has the aim of developing guidance and recommendations for the application of SWE standards to ocean observation data. This initiative, which is supported by projects such as NeXOS, FixO3, ODIP 2, BRIDGES and SeaDataCloud, has already led to first results, which will be introduced in the proposed presentation. In summary, we will introduce two building blocks showing how Earth observation networks can be coordinated: intelligent portal solutions to ensure better discoverability, and dedicated domain profiles of Sensor Web standards to ensure a common, interoperable exchange of the collected data.
Modular VO oriented Java EE service deployer
NASA Astrophysics Data System (ADS)
Molinaro, Marco; Cepparo, Francesco; De Marco, Marco; Knapic, Cristina; Apollo, Pietro; Smareglia, Riccardo
2014-07-01
The International Virtual Observatory Alliance (IVOA) has produced many standards and recommendations whose aim is to define an architecture that starts from astrophysical resources, in a general sense, and ends up in deployed consumable services (which are themselves astrophysical resources). Focusing on the Data Access Layer (DAL) system architecture that these standards define, in recent years a web-based application has been developed and maintained at INAF-OATs IA2 (Italian National Institute for Astrophysics - Astronomical Observatory of Trieste, Italian center of Astronomical Archives) to deploy and manage multiple VO (Virtual Observatory) services in a uniform way: VO-Dance. However, a set of criticalities has arisen since the VO-Dance idea was conceived, and major changes have occurred, and are still occurring, at the IVOA DAL layer (and its related standards): this urged IA2 to identify a new solution for its own service layer. Keeping the basic ideas from VO-Dance (simple service configuration, service instantiation at call time and modularity) while switching to different software technologies (e.g. dismissing Java Reflection in favour of an Enterprise Java Beans, EJB, based solution), the new solution has been sketched out and tested for feasibility. Here we present the results of this test study. The main constraints for the new project come from various fields: a better homogenized solution arising from IVOA DAL standards, for example the new DALI (Data Access Layer Interface) specification that acts as a common interface system for previous and upcoming access protocols; the need for a modular system where each component is based upon a single VO specification, allowing services to rely on common capabilities instead of homogenizing them inside service components directly; and the search for a scalable system that takes advantage of distributed systems. These constraints are addressed by the solutions sketched hereafter. Developing the new system with Java Enterprise technologies can better exploit existing libraries to build up the single tokens implementing the IVOA standards. Each component can be built from a single standard, and each deployed service (i.e. an instantiation of service components) can consume the other components' exposed methods and services without the need to homogenize them in dedicated libraries. Scalability can be achieved more easily by deploying components or sets of services in a distributed environment, using JNDI (Java Naming and Directory Interface) and RMI (Remote Method Invocation) technologies. Single service configuration will not differ significantly from the VO-Dance solution, given that the Java class instantiation that benefited from Java Reflection is simply moved to Java EJB pooling (and not, e.g., embedded in bundles for subsequent deployment).
ERIC Educational Resources Information Center
Hill, Linda L.; Crosier, Scott J.; Smith, Terrence R.; Goodchild, Michael; Iannella, Renato; Erickson, John S.; Reich, Vicky; Rosenthal, David S. H.
2001-01-01
Includes five articles. Topics include requirements for a content standard to describe computational models; architectures for digital rights management systems; access control for digital information objects; LOCKSS (Lots of Copies Keep Stuff Safe) that allows libraries to run Web caches for specific journals; and a Web site from the U.S.…
A Model Privacy Statement for Ohio Library Web Sites.
ERIC Educational Resources Information Center
Monaco, Michael J.
The purpose of this research was to develop a model privacy policy statement for library World Wide Web sites. First, standards of privacy protection were identified. These standards were culled from the privacy and confidentiality policies of the American Library Association, the Federal Trade Commission's online privacy reports, the guidelines…
A Natural Fit: Problem-based Learning and Technology Standards.
ERIC Educational Resources Information Center
Sage, Sara M.
2000-01-01
Discusses the use of problem-based learning to meet technology standards. Highlights include technology as a tool for locating and organizing information; the Wolf Wars problem for elementary and secondary school students that provides resources, including Web sites, for information; Web-based problems; and technology as assessment and as a…
29 CFR 1918.3 - Incorporation by reference.
Code of Federal Regulations, 2010 CFR
2010-07-01
....org. (5) ANSI Z41-1991, American National Standard for Personal Protection—Protective Footwear; IBR...; Web site: http://www.nsc.org. (6) ANSI Z87.1-2003, American National Standard Practice for...; fax: 703-528-2148; Web site: http://www.safetyequipment.org. (7) ANSI Z87.1-1989 (R-1998), American...
Neuhaus, Philipp; Doods, Justin; Dugas, Martin
2015-01-01
Automatic coding of medical terms is an important, but highly complicated and laborious task. To compare and evaluate different strategies, a framework with a standardized web interface was created. Two UMLS mapping strategies are compared to demonstrate the interface. The framework is a Java Spring application running on a Tomcat application server. It accepts different parameters and returns results in JSON format. To demonstrate the framework, a list of medical data items was mapped by two different methods: similarity search in a large table of terminology codes versus search in a manually curated repository. These mappings were reviewed by a specialist. The evaluation shows that the framework is flexible (due to standardized interfaces like HTTP and JSON), performant and reliable. The accuracy of automatically assigned codes is limited (up to 40%). Combining different semantic mappers into a standardized web API is feasible. The framework can be easily enhanced due to its modular design.
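A client of such a standardized HTTP/JSON interface could look like the following Python sketch; the endpoint, parameter names and response fields are hypothetical stand-ins rather than the framework's published API.

```python
# Hedged client sketch; endpoint, parameters and response shape are invented.
import requests

API = "http://example.org/term-mapper/map"  # hypothetical endpoint

def map_term(term: str, strategy: str) -> list[dict]:
    """Ask the server to code one medical term with a given strategy."""
    resp = requests.get(API, params={"term": term, "strategy": strategy})
    resp.raise_for_status()
    return resp.json()  # e.g. [{"cui": "C0020538", "score": 0.91}]

for strategy in ("similarity", "curated"):
    print(strategy, map_term("essential hypertension", strategy))
```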
Baby Steps: Starting Out on the World Wide Web.
ERIC Educational Resources Information Center
Simpson, Carol; McElmeel, Sharron L.
1997-01-01
While the Internet is the physical medium used to transport data, the World Wide Web is the collection of protocols and standards used to access the information. This article provides a basic explanation of what the Web is and describes common browser commands. Discusses graphic Web browsers; universal resource locators (URLs); file, message,…
Web processing service for landslide hazard assessment
NASA Astrophysics Data System (ADS)
Sandric, I.; Ursaru, P.; Chitu, D.; Mihai, B.; Savulescu, I.
2012-04-01
Hazard analysis requires heavy computation and specialized software. Web processing services can offer complex solutions that can be accessed through a light client (web or desktop). This paper presents a web processing service (both WPS and Esri Geoprocessing Service) for landslide hazard assessment. The web processing service was built with the Esri ArcGIS Server solution and Python, developed using ArcPy, GDAL Python and NumPy. A complex model for landslide hazard analysis, using both predisposing and triggering factors combined into a Bayesian temporal network with uncertainty propagation, was built and published as a WPS and Geoprocessing service using ArcGIS Standard Enterprise 10.1. The model uses as predisposing factors the first and second derivatives of the DEM, the effective precipitation, runoff, lithology and land use. All these parameters can be supplied by the client from other WFS services or by uploading and processing the data on the server. The user can choose to create the first and second derivatives of the DEM automatically on the server or to upload data already calculated. One of the main dynamic factors in the landslide analysis model is the leaf area index (LAI). The LAI offers the advantage of modelling not just the changes between different time periods expressed in years, but also the seasonal changes in land use throughout a year. The LAI index can be derived from various satellite images or downloaded as a product; the upload of such data (time series) is possible using the NetCDF file format. The model runs at a monthly time step, and for each time step all the parameter values and the a priori, conditional and posterior probabilities are obtained and stored in a log file. The validation process uses landslides that have occurred during the period up to the active time step and checks the recorded probabilities and parameter values for those time steps against the values of the active time step. Each time a landslide is positively identified, new a priori probabilities are recorded for each parameter. A complete log for the entire model is saved and used for statistical analysis, and a NetCDF file is created that can be downloaded from the server together with the log file.
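The per-time-step probability update can be illustrated with a minimal numeric sketch: the posterior obtained at one step becomes the prior of the next whenever a landslide is positively identified. All values below are illustrative.

```python
# Minimal illustration of sequential Bayesian updating; numbers are invented.
def posterior(prior: float, likelihood: float, evidence: float) -> float:
    """Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)."""
    return likelihood * prior / evidence

prior = 0.10  # a priori landslide probability
for month, (lik, ev) in enumerate([(0.7, 0.25), (0.6, 0.30), (0.8, 0.20)], 1):
    post = posterior(prior, lik, ev)
    print(f"month {month}: prior={prior:.3f} posterior={post:.3f}")
    prior = post  # landslide observed: posterior feeds the next time step
```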
Kiener, Joos
2013-12-11
Background Research in organic chemistry generates samples of novel chemicals together with their properties and other related data. The scientists involved must be able to store this data and search it by chemical structure. There are commercial solutions for common needs like chemical registration systems or electronic lab notebooks. However, no such solutions exist for the specific requirements of in-house databases and processes. Another issue is that commercial solutions carry the risk of vendor lock-in and may require an expensive license for a proprietary relational database management system. To speed up and simplify development for applications that require chemical structure search capabilities, I have developed Molecule Database Framework. The framework abstracts the storing and searching of chemical structures into method calls, so software developers do not require extensive knowledge about chemistry and the underlying database cartridge. This decreases application development time. Results Molecule Database Framework is written in Java and was created by integrating existing free and open-source tools and frameworks. The core functionality includes: • Support for multi-component compounds (mixtures) • Import and export of SD-files • Optional security (authorization) For chemical structure searching, Molecule Database Framework leverages the capabilities of the Bingo Cartridge for PostgreSQL and provides type-safe searching, caching, transactions and optional method-level security. Furthermore, the design of the entity classes and the reasoning behind it are explained. By means of a simple web application I describe how the framework can be used. I then benchmarked this example application to create some basic performance expectations for chemical structure searches and the import and export of SD-files. Conclusions Using a simple web application, it was shown that Molecule Database Framework successfully abstracts chemical structure searches and SD-file import and export into simple method calls. The framework offers good search performance on a standard laptop without any database tuning, partly because chemical structure searches are paged and cached. Molecule Database Framework is available for download on the project's web page on Bitbucket: https://bitbucket.org/kienerj/moleculedatabaseframework. PMID:24325762
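For readers unfamiliar with chemical structure search, the following illustrative Python/RDKit sketch shows the kind of substructure query the (Java) framework wraps behind method calls; the framework itself relies on the Bingo cartridge, not RDKit.

```python
# pip install rdkit -- illustrative substructure search, not the framework's code.
from rdkit import Chem

database = [Chem.MolFromSmiles(s) for s in
            ("CCO", "c1ccccc1O", "CC(=O)Oc1ccccc1C(=O)O")]
query = Chem.MolFromSmarts("c1ccccc1")  # any benzene ring

hits = [Chem.MolToSmiles(m) for m in database if m.HasSubstructMatch(query)]
print(hits)  # phenol and aspirin match; ethanol does not
```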
Silicon web process development
NASA Technical Reports Server (NTRS)
Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Blais, P. D.; Davis, J. R., Jr.
1977-01-01
Thirty-five (35) furnace runs were carried out during this quarter, of which 25 produced a total of 120 web crystals. The two main thermal models for the dendritic growth process were completed and are being used to assist the design of the thermal geometry of the web growth apparatus. The first model, a finite element representation of the susceptor and crucible, was refined to give greater precision and resolution in the critical central region of the melt. The second thermal model, which describes the dissipation of the latent heat to generate thickness-velocity data, was completed. Dendritic web samples were fabricated into solar cells using a standard configuration and a standard process for a N(+) -P-P(+) configuration. The detailed engineering design was completed for a new dendritic web growth facility of greater width capability than previous facilities.
Stewart, Tiffany; Han, Hongmei; Allen, H. Raymond; Bathalon, COL Gaston; Ryan, Donna H.; Newton, Robert L.; Williamson, Donald A.
2011-01-01
Background A significant number of soldiers exceed the maximum allowable weight standards or have body weights approaching the maximum allowable weight standards. This mandates development of scalable approaches to improve compliance with military weight standards. Methods We developed an intervention that included two components: (1) an Internet-based weight management program (Web site) and (2) a promotion program designed to promote and sustain usage of the Web site. The Web site remained online for 37 months, with the Web site promotion program ending after 25 months. Results Soldiers’ demographics were as follows: mean age, 32 years; body mass index (BMI), 28 kg/m2; 31% female; and 58% Caucasian. Civilian demographics were as follows: mean age, 38 years; BMI, 30 kg/m2; 84% female; and 55% Caucasian. Results indicated that 2417 soldiers and 2147 civilians (N = 4564) registered on the Web site. In the first 25 months (phase 1) of the study, new participants enrolled on the Web site at a rate of 88 (soldiers) and 80 (civilians) per month. After the promotion program was removed (phase 2), new participants enrolled at a rate of 18 (soldiers) and 13 (civilians) per month. Utilization of the Web site was associated with self-reported weight loss (p < .0001). Participants who utilized the Web site more frequently lost more weight (p < .0001). Participants reported satisfaction with the Web site. Conclusions The Web site and accompanying promotion program, when implemented at a military base, received satisfactory ratings and benefited a subset of participants in promoting weight loss. This justifies further examination of effectiveness in a randomized trial setting. PMID:21303642
The BiSciCol Triplifier: bringing biodiversity data to the Semantic Web.
Stucky, Brian J; Deck, John; Conlin, Tom; Ziemba, Lukasz; Cellinese, Nico; Guralnick, Robert
2014-07-29
Recent years have brought great progress in efforts to digitize the world's biodiversity data, but integrating data from many different providers, and across research domains, remains challenging. Semantic Web technologies have been widely recognized by biodiversity scientists for their potential to help solve this problem, yet these technologies have so far seen little use for biodiversity data. Such slow uptake has been due, in part, to the relative complexity of Semantic Web technologies along with a lack of domain-specific software tools to help non-experts publish their data to the Semantic Web. The BiSciCol Triplifier is new software that greatly simplifies the process of converting biodiversity data in standard, tabular formats, such as Darwin Core-Archives, into Semantic Web-ready Resource Description Framework (RDF) representations. The Triplifier uses a vocabulary based on the popular Darwin Core standard, includes both Web-based and command-line interfaces, and is fully open-source software. Unlike most other RDF conversion tools, the Triplifier does not require detailed familiarity with core Semantic Web technologies, and it is tailored to a widely popular biodiversity data format and vocabulary standard. As a result, the Triplifier can often fully automate the conversion of biodiversity data to RDF, thereby making the Semantic Web much more accessible to biodiversity scientists who might otherwise have relatively little knowledge of Semantic Web technologies. Easy availability of biodiversity data as RDF will allow researchers to combine data from disparate sources and analyze them with powerful linked data querying tools. However, before software like the Triplifier, and Semantic Web technologies in general, can reach their full potential for biodiversity science, the biodiversity informatics community must address several critical challenges, such as the widespread failure to use robust, globally unique identifiers for biodiversity data.
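Conceptually, triplification turns each column of a tabular Darwin Core record into an RDF predicate. The following minimal Python/rdflib sketch illustrates the idea (the subject IRI is a made-up example of the globally unique identifiers the authors call for; the Triplifier itself is a separate tool).

```python
# pip install rdflib -- illustrative triplification of one Darwin Core record.
from rdflib import Graph, Literal, Namespace, URIRef

DWC = Namespace("http://rs.tdwg.org/dwc/terms/")
record = {"scientificName": "Puma concolor",
          "country": "Mexico",
          "eventDate": "2012-04-16"}

g = Graph()
occurrence = URIRef("http://example.org/occurrence/42")  # hypothetical GUID
for column, value in record.items():
    g.add((occurrence, DWC[column], Literal(value)))

print(g.serialize(format="turtle"))
```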
Applying Semantic Web Services and Wireless Sensor Networks for System Integration
NASA Astrophysics Data System (ADS)
Berkenbrock, Gian Ricardo; Hirata, Celso Massaki; de Oliveira Júnior, Frederico Guilherme Álvares; de Oliveira, José Maria Parente
In environments like factories, buildings, and homes, automation services tend to change often during their lifetime. Changes concern business rules, process optimization, cost reduction, and so on. It is important to provide a smooth and straightforward way to deal with these changes so that they can be handled quickly and at low cost. Some prominent solutions use the flexibility of Wireless Sensor Networks and the meaningful descriptions of Semantic Web Services to provide service integration. In this work, we give an overview of current solutions for machinery integration that combine both technologies, as well as a discussion of some perspectives and open issues in applying Wireless Sensor Networks and Semantic Web Services to automation service integration.
A simple web-based tool to compare freshwater fish data collected using AFS standard methods
Bonar, Scott A.; Mercado-Silva, Norman; Rahr, Matt; Torrey, Yuta T.; Cate, Averill
2016-01-01
The American Fisheries Society (AFS) recently published Standard Methods for Sampling North American Freshwater Fishes. Enlisting the expertise of 284 scientists from 107 organizations throughout Canada, Mexico, and the United States, this text was developed to facilitate comparisons of fish data across regions or time. Here we describe a user-friendly web tool that automates among-sample comparisons in individual fish condition, population length-frequency distributions, and catch per unit effort (CPUE) data collected using AFS standard methods. Currently, the web tool (1) provides instantaneous summaries of almost 4,000 data sets of condition, length frequency, and CPUE of common freshwater fishes collected using standard gears in 43 states and provinces; (2) is easily appended with new standardized field data to update subsequent queries and summaries; (3) compares fish data from a particular water body with continent, ecoregion, and state data summaries; and (4) provides additional information about AFS standard fish sampling including benefits, ongoing validation studies, and opportunities to comment on specific methods. The web tool—programmed in a PHP-based Drupal framework—was supported by several AFS Sections, agencies, and universities and is freely available from the AFS website and fisheriesstandardsampling.org. With widespread use, the online tool could become an important resource for fisheries biologists.
NASA Astrophysics Data System (ADS)
Ablowitz, Mark J.; Curtis, Christopher W.
2011-05-01
The Benney-Luke equation, which arises as a long wave asymptotic approximation of water waves, contains the Kadomtsev-Petviashvili (KP) equation as a leading-order maximal balanced approximation. The question analyzed is how the Benney-Luke equation modifies the so-called web solutions of the KP equation. It is found that the Benney-Luke equation introduces dispersive radiation which breaks each of the symmetric soliton-like humps well away from the interaction region of the KP web solution into a tail of multi-peaked oscillating profiles behind the main solitary hump. Computation indicates that the wave structure is modified near the center of the interaction region. Both analytical and numerical techniques are employed for working with non-periodic, non-decaying solutions on unbounded domains.
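For reference, in one common normalization the two model equations read as follows (a schematic form with the small-amplitude ordering parameters suppressed; the coefficients a and b carry the surface-tension contribution):

```latex
% KP equation; web solutions are resonant line-soliton solutions of this model
\[ (u_t + 6\,u\,u_x + u_{xxx})_x + 3\,u_{yy} = 0 \]
% Benney-Luke equation with higher-order dispersion coefficients a, b
\[ \Phi_{tt} - \Delta\Phi + a\,\Delta^2\Phi - b\,\Delta\Phi_{tt}
   + \Phi_t\,\Delta\Phi + 2\,\nabla\Phi\cdot\nabla\Phi_t = 0 \]
```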
WebSat - A web software for microsatellite marker development
Martins, Wellington Santos; Soares Lucas, Divino César; de Souza Neves, Kelligton Fabricio; Bertioli, David John
2009-01-01
Simple sequence repeats (SSRs), also known as microsatellites, have been extensively used as molecular markers due to their abundance and high degree of polymorphism. We have developed simple-to-use web software, called WebSat, for microsatellite molecular marker prediction and development. WebSat is accessible through the Internet, requiring no program installation. Although a web solution, it makes use of Ajax techniques, providing a rich, responsive user interface. WebSat allows the submission of sequences, the visualization of microsatellites and the design of primers suitable for their amplification. The program allows full control of parameters and easy export of the resulting data, thus facilitating the development of microsatellite markers. Availability The web tool may be accessed at http://purl.oclc.org/NET/websat/ PMID:19255650
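The core of microsatellite detection can be sketched in a few lines of Python: a backreference regex finds perfect tandem repeats of 1-6 bp motifs. The thresholds and sequence below are illustrative, not WebSat's actual defaults.

```python
# Illustrative SSR finder; thresholds are invented, not WebSat's defaults.
import re

def find_ssrs(seq: str, min_repeats: int = 5):
    """Yield (motif, copies, start) for perfect tandem repeats of 1-6 bp motifs."""
    pattern = re.compile(r"([ACGT]{1,6}?)\1{%d,}" % (min_repeats - 1))
    for m in pattern.finditer(seq.upper()):
        motif = m.group(1)
        yield motif, len(m.group(0)) // len(motif), m.start()

for motif, copies, start in find_ssrs("GGATATATATATATCCCAGCAGCAGCAGCAGT"):
    print(f"{motif} x{copies} at {start}")  # finds AT x6 and CAG x5
```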
Hazlehurst, Brian L; Kurtz, Stephen E; Masica, Andrew; Stevens, Victor J; McBurnie, Mary Ann; Puro, Jon E; Vijayadeva, Vinutha; Au, David H; Brannon, Elissa D; Sittig, Dean F
2015-10-01
Comparative effectiveness research (CER) requires the capture and analysis of data from disparate sources, often from a variety of institutions with diverse electronic health record (EHR) implementations. In this paper we describe the CER Hub, a web-based informatics platform for developing and conducting research studies that combine comprehensive electronic clinical data from multiple health care organizations. The CER Hub platform implements a data processing pipeline that employs informatics standards for data representation and web-based tools for developing study-specific data processing applications, providing standardized access to the patient-centric EHR across organizations. The CER Hub is being used to conduct two CER studies utilizing data from six geographically distributed and demographically diverse health systems. These foundational studies address the effectiveness of medications for controlling asthma and the effectiveness of smoking cessation services delivered in primary care. The CER Hub includes four key capabilities: the ability to process and analyze both free-text and coded clinical data in the EHR; a data processing environment supported by distributed data and study governance processes; a clinical data-interchange format for facilitating standardized extraction of clinical data from EHRs; and a library of shareable clinical data processing applications. CER requires coordinated and scalable methods for extracting, aggregating, and analyzing complex, multi-institutional clinical data. By offering a range of informatics tools integrated into a framework for conducting studies using EHR data, the CER Hub provides a solution to the challenges of multi-institutional research using electronic medical record data. Copyright © 2015. Published by Elsevier Ireland Ltd.
Adding Processing Functionality to the Sensor Web
NASA Astrophysics Data System (ADS)
Stasch, Christoph; Pross, Benjamin; Jirka, Simon; Gräler, Benedikt
2017-04-01
The Sensor Web allows discovering, accessing and tasking different kinds of environmental sensors in the Web, ranging from simple in-situ sensors to remote sensing systems. However, (geo-)processing functionality needs to be applied to integrate data from different sensor sources and to generate higher-level information products. Yet, a common standardized approach for processing sensor data in the Sensor Web is still missing, and the integration differs from application to application. Standardizing not only the provision of sensor data, but also the processing, facilitates the sharing and re-use of processing modules, enables reproducibility of processing results, and provides a common way to integrate external scalable processing facilities or legacy software. In this presentation, we provide an overview of ongoing research projects that develop concepts for coupling standardized geoprocessing technologies with Sensor Web technologies. At first, different architectures for coupling sensor data services with geoprocessing services are presented. Afterwards, profiles of the OGC Web Processing Service for linear regression and spatio-temporal interpolation are introduced, which allow consuming sensor data coming from, and uploading predictions to, Sensor Observation Services. The profiles are implemented in processing services for the hydrological domain. Finally, we illustrate how the R software can be coupled with existing OGC Sensor Web and Geoprocessing Services and present an example of how a Web app can be built that allows exploring the results of environmental models in an interactive way using the R Shiny framework. All of the software presented is available as Open Source Software.
A new reference implementation of the PSICQUIC web service.
del-Toro, Noemi; Dumousseau, Marine; Orchard, Sandra; Jimenez, Rafael C; Galeota, Eugenia; Launay, Guillaume; Goll, Johannes; Breuer, Karin; Ono, Keiichiro; Salwinski, Lukasz; Hermjakob, Henning
2013-07-01
The Proteomics Standard Initiative Common QUery InterfaCe (PSICQUIC) specification was created by the Human Proteome Organization Proteomics Standards Initiative (HUPO-PSI) to enable computational access to molecular-interaction data resources by means of a standard Web Service and query language. Currently providing >150 million binary interaction evidences from 28 servers globally, the PSICQUIC interface allows the concurrent search of multiple molecular-interaction information resources using a single query. Here, we present an extension of the PSICQUIC specification (version 1.3), which has been released to be compliant with the enhanced standards in molecular interactions. The new release also includes a new reference implementation of the PSICQUIC server available to the data providers. It offers augmented web service capabilities and improves the user experience. PSICQUIC has been running for almost 5 years, with a user base growing from only 4 data providers to 28 (April 2013) allowing access to 151 310 109 binary interactions. The power of this web service is shown in PSICQUIC View web application, an example of how to simultaneously query, browse and download results from the different PSICQUIC servers. This application is free and open to all users with no login requirement (http://www.ebi.ac.uk/Tools/webservices/psicquic/view/main.xhtml).
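A PSICQUIC query can be issued with plain HTTP. The sketch below follows the specification's /search/query/{MIQL} REST pattern against the IntAct endpoint; the exact URL may have moved since the article was written, and results come back as tab-separated MITAB lines.

```python
# Hedged PSICQUIC REST sketch; the endpoint URL may have changed over time.
import requests

BASE = ("http://www.ebi.ac.uk/Tools/webservices/psicquic/intact/"
        "webservices/current/search/")

resp = requests.get(BASE + "query/brca2",
                    params={"firstResult": 0, "maxResults": 10})
resp.raise_for_status()
for line in resp.text.splitlines():
    fields = line.split("\t")    # MITAB 2.5: 15 tab-separated columns
    print(fields[0], fields[1])  # interactor A and B identifiers
```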
Development of an IHE MRRT-compliant open-source web-based reporting platform.
Pinto Dos Santos, Daniel; Klos, G; Kloeckner, R; Oberle, R; Dueber, C; Mildenberger, P
2017-01-01
To develop a platform that uses structured reporting templates according to the IHE Management of Radiology Report Templates (MRRT) profile, and to implement this platform into clinical routine. The reporting platform uses standard web technologies (HTML / JavaScript and PHP / MySQL) only. Several freely available external libraries were used to simplify the programming. The platform runs on a standard web server, connects with the radiology information system (RIS) and PACS, and is easily accessible via a standard web browser. A prototype platform that allows structured reporting to be easily incorporated into the clinical routine was developed and successfully tested. To date, 797 reports were generated using IHE MRRT-compliant templates (many of them downloaded from the RSNA's radreport.org website). Reports are stored in a MySQL database and are easily accessible for further analyses. Development of an IHE MRRT-compliant platform for structured reporting is feasible using only standard web technologies. All source code will be made available upon request under a free license, and the participation of other institutions in further development is welcome. • A platform for structured reporting using IHE MRRT-compliant templates is presented. • Incorporating structured reporting into clinical routine is feasible. • Full source code will be provided upon request under a free license.
da Silva, Kátia Regina; Costa, Roberto; Crevelari, Elizabeth Sartori; Lacerda, Marianna Sobral; de Moraes Albertini, Caio Marcos; Filho, Martino Martinelli; Santana, José Eduardo; Vissoci, João Ricardo Nickenig; Pietrobon, Ricardo; Barros, Jacson V.
2013-01-01
Background The ability to apply standard and interoperable solutions for implementing and managing medical registries, as well as to aggregate, reproduce, and access data sets from legacy formats and platforms to advanced standard formats and operating systems, is crucial for both clinical healthcare and biomedical research settings. Purpose Our study describes a reproducible, highly scalable, standard framework for a device registry implementation addressing both local data quality components and global linking problems. Methods and Results We developed a device registry framework involving the following steps: (1) Data standards definition and representation of the research workflow, (2) Development of electronic case report forms using REDCap (Research Electronic Data Capture), (3) Data collection according to the clinical research workflow, (4) Data augmentation by enriching the registry database with local electronic health records, governmental databases and linked open data collections, (5) Data quality control and (6) Data dissemination through the registry Web site. Our registry adopted all applicable standardized data elements proposed by the American College of Cardiology / American Heart Association Clinical Data Standards, as well as variables derived from cardiac device randomized trials and the Clinical Data Interchange Standards Consortium. Local interoperability was established between REDCap and data derived from the Electronic Health Record system. The original data set was also augmented by incorporating the reimbursed values paid by the Brazilian government during a hospitalization for pacemaker implantation. By linking our registry to the open data collection repository Linked Clinical Trials (LinkedCT) we found 130 clinical trials that are potentially correlated with our pacemaker registry. Conclusion This study demonstrates how standard and reproducible solutions can be applied in the implementation of medical registries to constitute a re-usable framework. Such an approach has the potential to facilitate data integration between healthcare and research settings and provides a useful framework for other biomedical registries. PMID:23936257
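Step (4) hinges on programmatic export of registry records. REDCap exposes a token-based HTTP API for this; the sketch below uses real REDCap API parameters (content, format, fields), but the URL, token and field names are placeholders invented for illustration.

```python
# Hedged sketch of a REDCap record export; URL, token and fields are invented.
import requests

REDCAP_URL = "https://redcap.example.org/api/"  # hypothetical instance
TOKEN = "0123456789ABCDEF"                      # per-project API token

payload = {
    "token": TOKEN,
    "content": "record",       # export records
    "format": "json",
    "fields[0]": "record_id",  # illustrative field names
    "fields[1]": "implant_date",
}
records = requests.post(REDCAP_URL, data=payload).json()
print(len(records), "records exported for augmentation")
```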
OneGeology-Europe: architecture, portal and web services to provide a European geological map
NASA Astrophysics Data System (ADS)
Tellez-Arenas, Agnès.; Serrano, Jean-Jacques; Tertre, François; Laxton, John
2010-05-01
OneGeology-Europe is a large, ambitious project to make geological spatial data more widely known and accessible. The project develops an integrated system of data to create and make accessible for the first time, through the internet, the geological map of the whole of Europe. The architecture implemented by the project is web-service oriented, based on the OGC standards: the geological map is not a centralized database but is composed of several web services, each of them hosted by a European country involved in the project. Since geological data are elaborated differently from country to country, they are difficult to share. While providing more detailed and complete information, OneGeology-Europe will foster an easier exchange of data within Europe and globally, even beyond the geological community. This implies substantial work on harmonizing the data, both the model and the content. OneGeology-Europe is characterised by the high technological capacity of the EU Member States, and has the final goal of harmonising European geological survey data according to common standards. As a direct consequence, Europe will make a further step in terms of innovation and information dissemination, continuing to play a world-leading role in the development of geosciences information. The scope of the common harmonized data model was defined primarily by the requirements of the geological map of Europe, but in addition users were consulted and the requirements of both INSPIRE and 'high-resolution' geological maps were considered. The data model is based on GeoSciML, developed since 2006 by a group of Geological Surveys. The data providers involved in the project implemented a new component that allows the web services to deliver the geological map expressed in GeoSciML. In order to capture the information describing the geological units of the map of Europe, the scope of the data model needs to include lithology, age, genesis and metamorphic character; for high-resolution maps, physical properties, bedding characteristics and weathering also need to be added. Furthermore, geological data held by national geological surveys are generally described in the national language of each country, so the project has to deal with the multilingual issue, an important requirement of the INSPIRE directive. The project provides a list of harmonized vocabularies, a set of web services for working with them, and a web site that helps geoscientists map the terms used in national datasets to these vocabularies. The web services provided by each data provider, with the particular component that allows them to deliver the harmonised data model and to handle multilingualism, are the first part of the architecture. The project also implements a web portal that provides several functionalities. Thanks to the common data model implemented by each web service delivering a part of the geological map, and using the OGC SLD standard, the portal client offers attribute-based filtering: a user can request a sub-selection of the map, for instance on a particular attribute such as "age is Quaternary", and display only the matching parts of the map. Using the web services for the common vocabularies, the displayed data are translated. The project started in September 2008 and runs for two years, with 29 partners from 20 countries (20 partners are Geological Surveys). The budget is 3.25 M€, with a European Commission contribution of 2.6 M€.
The paper describes the technical solutions used to implement the OneGeology-Europe components: the profile of the common data model for exchanging geological data; the web services to view and access geological data; and a geoportal that provides users with a friendly way to discover, view and access those data.
DrishtiCare: a telescreening platform for diabetic retinopathy powered with fundus image analysis.
Joshi, Gopal Datt; Sivaswamy, Jayanthi
2011-01-01
Diabetic retinopathy is the leading cause of blindness in urban populations. Early diagnosis through regular screening and timely treatment has been shown to prevent visual loss and blindness. It is very difficult to cater to this vast set of diabetes patients, primarily because of the high costs of reaching out to patients and a scarcity of skilled personnel. Telescreening offers a cost-effective way to reach out to patients but is still inadequate due to an insufficient number of experts who serve the diabetes population. Developments in fundus image analysis have shown promise in addressing the scarcity of skilled personnel for large-scale screening. This article aims at addressing the underlying issues in traditional telescreening in order to develop a solution that leverages these developments in fundus image analysis. We propose a novel web-based telescreening solution (called DrishtiCare) integrating various value-added fundus image analysis components. A web-based platform on the software-as-a-service (SaaS) delivery model is chosen to make the service cost-effective, easy to use, and scalable. A server-based prescreening system is employed to scrutinize the fundus images of patients and to refer them to the experts. An automatic quality assessment module ensures the transfer of fundus images that meet grading standards. An easy-to-use interface, enabled with new visualization features, is designed for case examination by experts. Three local primary eye hospitals have participated in and used DrishtiCare's telescreening service. A preliminary evaluation of the proposed platform was performed on a set of 119 patients, of whom 23% were identified with sight-threatening retinopathy. Evaluation at a larger scale is currently in progress, with a total of 450 patients enrolled. The proposed approach provides an innovative way of integrating automated fundus image analysis into the telescreening framework to address well-known challenges in large-scale disease screening. It offers a low-cost, effective, and easily adoptable screening solution to primary care providers. © 2010 Diabetes Technology Society.
Bernal-Rusiel, Jorge L; Rannou, Nicolas; Gollub, Randy L; Pieper, Steve; Murphy, Shawn; Robertson, Richard; Grant, Patricia E; Pienaar, Rudolph
2017-01-01
In this paper we present a web-based software solution to the problem of implementing real-time collaborative neuroimage visualization. In both clinical and research settings, simple and powerful access to imaging technologies across multiple devices is becoming increasingly useful. Prior technical solutions have used a server-side rendering and push-to-client model wherein only the server has the full image dataset. We propose a rich-client solution in which each client has all the data and uses the Google Drive Realtime API for state synchronization. We have developed a small set of reusable client-side object-oriented JavaScript modules that make use of the XTK toolkit, a popular open-source JavaScript library also developed by our team, for the in-browser rendering and visualization of brain image volumes. Efficient realtime communication among the remote instances is achieved by using just a small JSON object, comprising a representation of the XTK image renderers' state, as the Google Drive Realtime collaborative data model. The developed open-source JavaScript modules have already been instantiated in a web app called MedView, a distributed collaborative neuroimage visualization application that is delivered to users over the web without requiring the installation of any extra software or browser plugin. This responsive application allows multiple physically distant physicians or researchers to cooperate in real time to reach a diagnosis or scientific conclusion. It also serves as a proof of concept for the capabilities of the presented technological solution.
The 2nd DBCLS BioHackathon: interoperable bioinformatics Web services for integrated applications
2011-01-01
Background The interaction between biological researchers and the bioinformatics tools they use is still hampered by incomplete interoperability between such tools. To ensure interoperability initiatives are effectively deployed, end-user applications need to be aware of, and support, best practices and standards. Here, we report on an initiative in which software developers and genome biologists came together to explore and raise awareness of these issues: BioHackathon 2009. Results Developers in attendance came from diverse backgrounds, with experts in Web services, workflow tools, text mining and visualization. Genome biologists provided expertise and exemplar data from the domains of sequence and pathway analysis and glyco-informatics. One goal of the meeting was to evaluate the ability to address real world use cases in these domains using the tools that the developers represented. This resulted in i) a workflow to annotate 100,000 sequences from an invertebrate species; ii) an integrated system for analysis of the transcription factor binding sites (TFBSs) enriched based on differential gene expression data obtained from a microarray experiment; iii) a workflow to enumerate putative physical protein interactions among enzymes in a metabolic pathway using protein structure data; iv) a workflow to analyze glyco-gene-related diseases by searching for human homologs of glyco-genes in other species, such as fruit flies, and retrieving their phenotype-annotated SNPs. Conclusions Beyond deriving prototype solutions for each use-case, a second major purpose of the BioHackathon was to highlight areas of insufficiency. We discuss the issues raised by our exploration of the problem/solution space, concluding that there are still problems with the way Web services are modeled and annotated, including: i) the absence of several useful data or analysis functions in the Web service "space"; ii) the lack of documentation of methods; iii) lack of compliance with the SOAP/WSDL specification among and between various programming-language libraries; and iv) incompatibility between various bioinformatics data formats. Although it was still difficult to solve real world problems posed to the developers by the biological researchers in attendance because of these problems, we note the promise of addressing these issues within a semantic framework. PMID:21806842
The 2nd DBCLS BioHackathon: interoperable bioinformatics Web services for integrated applications.
Katayama, Toshiaki; Wilkinson, Mark D; Vos, Rutger; Kawashima, Takeshi; Kawashima, Shuichi; Nakao, Mitsuteru; Yamamoto, Yasunori; Chun, Hong-Woo; Yamaguchi, Atsuko; Kawano, Shin; Aerts, Jan; Aoki-Kinoshita, Kiyoko F; Arakawa, Kazuharu; Aranda, Bruno; Bonnal, Raoul Jp; Fernández, José M; Fujisawa, Takatomo; Gordon, Paul Mk; Goto, Naohisa; Haider, Syed; Harris, Todd; Hatakeyama, Takashi; Ho, Isaac; Itoh, Masumi; Kasprzyk, Arek; Kido, Nobuhiro; Kim, Young-Joo; Kinjo, Akira R; Konishi, Fumikazu; Kovarskaya, Yulia; von Kuster, Greg; Labarga, Alberto; Limviphuvadh, Vachiranee; McCarthy, Luke; Nakamura, Yasukazu; Nam, Yunsun; Nishida, Kozo; Nishimura, Kunihiro; Nishizawa, Tatsuya; Ogishima, Soichi; Oinn, Tom; Okamoto, Shinobu; Okuda, Shujiro; Ono, Keiichiro; Oshita, Kazuki; Park, Keun-Joon; Putnam, Nicholas; Senger, Martin; Severin, Jessica; Shigemoto, Yasumasa; Sugawara, Hideaki; Taylor, James; Trelles, Oswaldo; Yamasaki, Chisato; Yamashita, Riu; Satoh, Noriyuki; Takagi, Toshihisa
2011-08-02
The interaction between biological researchers and the bioinformatics tools they use is still hampered by incomplete interoperability between such tools. To ensure interoperability initiatives are effectively deployed, end-user applications need to be aware of, and support, best practices and standards. Here, we report on an initiative in which software developers and genome biologists came together to explore and raise awareness of these issues: BioHackathon 2009. Developers in attendance came from diverse backgrounds, with experts in Web services, workflow tools, text mining and visualization. Genome biologists provided expertise and exemplar data from the domains of sequence and pathway analysis and glyco-informatics. One goal of the meeting was to evaluate the ability to address real world use cases in these domains using the tools that the developers represented. This resulted in i) a workflow to annotate 100,000 sequences from an invertebrate species; ii) an integrated system for analysis of the transcription factor binding sites (TFBSs) enriched based on differential gene expression data obtained from a microarray experiment; iii) a workflow to enumerate putative physical protein interactions among enzymes in a metabolic pathway using protein structure data; iv) a workflow to analyze glyco-gene-related diseases by searching for human homologs of glyco-genes in other species, such as fruit flies, and retrieving their phenotype-annotated SNPs. Beyond deriving prototype solutions for each use-case, a second major purpose of the BioHackathon was to highlight areas of insufficiency. We discuss the issues raised by our exploration of the problem/solution space, concluding that there are still problems with the way Web services are modeled and annotated, including: i) the absence of several useful data or analysis functions in the Web service "space"; ii) the lack of documentation of methods; iii) lack of compliance with the SOAP/WSDL specification among and between various programming-language libraries; and iv) incompatibility between various bioinformatics data formats. Although it was still difficult to solve real world problems posed to the developers by the biological researchers in attendance because of these problems, we note the promise of addressing these issues within a semantic framework.
Uncertainty in exposure to air pollution
NASA Astrophysics Data System (ADS)
Pebesma, Edzer; Helle, Kristina; Christoph, Stasch; Rasouli, Soora; Timmermans, Harry; Walker, Sam-Erik; Denby, Bruce
2013-04-01
To assess exposure to air pollution for a person or for a group of people, one needs to know where the person or group is as a function of time, and what the air pollution is at these times and locations. In this study we used the Albatross activity-based model to assess the whereabouts of people and the uncertainties therein, and a probabilistic air quality system based on TAPM/EPISODE to assess air quality probabilistically. The outcomes of the two models were combined to assess exposure to air pollution, and the errors in it. We used the area around Rotterdam (Netherlands) as a case study. As the outcomes of both models come as Monte Carlo realizations, it was relatively easy to cancel one of the sources of uncertainty (movement of persons, air pollution) in order to identify their respective contributions, and also to compare evaluations for individuals with averages for a population of persons. As the output is probabilistic, and in addition spatially and temporally varying, the visual analysis of the complete results poses some challenges. This case study was one of the test cases in the UncertWeb project, which has built concepts and tools to realize the uncertainty-enabled model web. Some of the tools and protocols will be shown and evaluated in this presentation. For the uncertainty of exposure, the uncertainty of air quality was more important than the uncertainty of people's locations. This difference was stronger for PM10 than for NO2. The workflow was implemented as generic Web services in UncertWeb that also allow for inputs other than the simulated activity schedules, and air quality at other resolutions. However, owing to this flexibility, the Web services require standardized formats, and the overlay algorithm is not optimized for the specific use case, resulting in data and processing overhead. Hence, we implemented the full analysis in parallel in R for this specific case, as the model web solution had difficulties with the massive data.
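The cancel-one-source-at-a-time analysis described above is easy to sketch with numpy. The arrays below are synthetic stand-ins for the Albatross and TAPM/EPISODE Monte Carlo outputs, and the simple time-averaged exposure metric is an illustrative assumption, not the study's actual model.

import numpy as np

rng = np.random.default_rng(0)
R, T, L = 100, 24, 50          # realizations, hours, candidate locations

conc = rng.lognormal(3.0, 0.4, size=(R, T, L))   # air quality realizations
loc = rng.integers(0, L, size=(R, T))            # person-location realizations

def exposure(conc, loc):
    # Time-averaged concentration experienced along each realization's path.
    r = np.arange(conc.shape[0])[:, None]
    t = np.arange(conc.shape[1])[None, :]
    return conc[r, t, loc].mean(axis=1)          # one value per realization

full = exposure(conc, loc)
# Cancel location uncertainty: every realization uses one fixed trajectory.
only_air = exposure(conc, np.broadcast_to(loc[0], loc.shape))
# Cancel air-quality uncertainty: every realization sees the mean field.
only_move = exposure(np.broadcast_to(conc.mean(0), conc.shape), loc)

for name, e in [("total", full), ("air only", only_air), ("movement only", only_move)]:
    print(f"{name:14s} std = {e.std():.2f}")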
77 FR 73022 - U.S. Environmental Solutions Toolkit
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-07
... Commerce continues to develop the web-based U.S. Environmental Solutions Toolkit to be used by foreign environmental officials and foreign end-users of environmental technologies that will outline U.S. approaches to... DEPARTMENT OF COMMERCE International Trade Administration U.S. Environmental Solutions Toolkit...
World Wide Web Server Standards and Guidelines.
ERIC Educational Resources Information Center
Stubbs, Keith M.
This document defines the specific standards and general guidelines which the U.S. Department of Education (ED) will use to make information available on the World Wide Web (WWW). The purpose of providing such guidance is to ensure high quality and consistent content, organization, and presentation of information on ED WWW servers, in order to…
Code of Federal Regulations, 2010 CFR
2010-10-01
... tiedown assemblies. Tiedown assemblies (including chains, wire rope, steel strapping, synthetic webbing... . . . Must conform to . . . (1) Steel strapping 1,2 Standard Specification for Strapping, Flat Steel and... Association of Chain Manufacturers' Welded Steel Chain Specifications, dated September 28, 2005. 4 (3) Webbing...
Code of Federal Regulations, 2011 CFR
2011-10-01
... tiedown assemblies. Tiedown assemblies (including chains, wire rope, steel strapping, synthetic webbing... . . . Must conform to . . . (1) Steel strapping 1,2 Standard Specification for Strapping, Flat Steel and... Association of Chain Manufacturers' Welded Steel Chain Specifications, dated September 28, 2005. 4 (3) Webbing...
Weaving a Secure Web around Education: A Guide to Technology Standards and Security.
ERIC Educational Resources Information Center
National Forum on Education Statistics (ED/OERI), Washington, DC.
The purpose of this guidebook is to assist education agencies and organizations--which include state education agencies or state departments of education, school districts, and schools--in the development, maintenance, and standardization of effective Web sites. Also included is a detailed examination of the procedures necessary to provide…
Readability Levels of Health-Based Websites: From Content to Comprehension
ERIC Educational Resources Information Center
Schutten, Mary; McFarland, Allison
2009-01-01
Three of the national health education standards include decision-making, accessing information and analyzing influences. WebQuests are a popular inquiry-oriented method used by secondary teachers to help students achieve these content standards. While WebQuests support higher level thinking skills, the readability level of the information on the…
NASA Astrophysics Data System (ADS)
Parikh, Ashesh; Mehta, Nihal
2015-03-01
Recent advances in internet browser technologies make it possible to incorporate the advanced functionality of a traditional PACS for viewing DICOM medical images in standard web browsers without the need to pre-install any plug-ins, apps or software. We demonstrate some of these capabilities in standard web browsers, setting the stage for a cloud-based PACS.
Web3D Technologies in Learning, Education and Training: Motivations, Issues, Opportunities
ERIC Educational Resources Information Center
Chittaro, Luca; Ranon, Roberto
2007-01-01
Web3D open standards allow the delivery of interactive 3D virtual learning environments through the Internet, reaching potentially large numbers of learners worldwide, at any time. This paper introduces the educational use of virtual reality based on Web3D technologies. After briefly presenting the main Web3D technologies, we summarize the…
How Accessible Are Public Libraries' Web Sites? A Study of Georgia Public Libraries
ERIC Educational Resources Information Center
Ingle, Emma; Green, Ravonne A.; Huprich, Julia
2009-01-01
One issue that public librarians must consider when planning Web site design is accessibility for patrons with disabilities. This article reports a study of Web site accessibility of public libraries in Georgia. The focus of the report is whether public libraries use accessible guidelines and standards in making their Web sites accessible. An…
ERIC Educational Resources Information Center
McRae, Christopher; Karuso, Peter; Liu, Fei
2012-01-01
The Web is now a standard tool for information access and dissemination in higher education. The prospect of Web-based, simulated learning platforms and technologies, however, remains underexplored. We have developed a Web-based tutorial program (ChemVoyage) for a third-year organic chemistry class on the topic of pericyclic reactions to…
40 CFR 63.3300 - Which of my emission sources are affected by this subpart?
Code of Federal Regulations, 2014 CFR
2014-07-01
... CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants: Paper and Other Web Coating... affected source subject to this subpart is the collection of all web coating lines at your facility. This includes web coating lines engaged in the coating of metal webs that are used in flexible packaging, and...
40 CFR 63.3300 - Which of my emission sources are affected by this subpart?
Code of Federal Regulations, 2012 CFR
2012-07-01
... CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants: Paper and Other Web Coating... affected source subject to this subpart is the collection of all web coating lines at your facility. This includes web coating lines engaged in the coating of metal webs that are used in flexible packaging, and...
40 CFR 63.3300 - Which of my emission sources are affected by this subpart?
Code of Federal Regulations, 2013 CFR
2013-07-01
... CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants: Paper and Other Web Coating... affected source subject to this subpart is the collection of all web coating lines at your facility. This includes web coating lines engaged in the coating of metal webs that are used in flexible packaging, and...
Going, going, still there: using the WebCite service to permanently archive cited web pages.
Eysenbach, Gunther; Trudel, Mathieu
2005-12-30
Scholars are increasingly citing electronic "web references" which are not preserved in libraries or full text archives. WebCite is a new standard for citing web references. To "webcite" a document involves archiving the cited Web page through www.webcitation.org and citing the WebCite permalink instead of (or in addition to) the unstable live Web page. This journal has amended its "instructions for authors" accordingly, asking authors to archive cited Web pages before submitting a manuscript. Almost 200 other journals are already using the system. We discuss the rationale for WebCite, its technology, and how scholars, editors, and publishers can benefit from the service. Citing scholars initiate an archiving process of all cited Web references, ideally before they submit a manuscript. Authors of online documents and websites which are expected to be cited by others can ensure that their work is permanently available by creating an archived copy using WebCite and providing the citation information, including the WebCite link, on their Web document(s). Editors should ask their authors to cache all cited Web addresses (Uniform Resource Locators, or URLs) "prospectively" before submitting their manuscripts to their journal. Editors and publishers should also instruct their copyeditors to cache cited Web material if the author has not done so already. WebCite can also process publisher-submitted "citing articles" (submitted for example as eXtensible Markup Language [XML] documents) to automatically archive all cited Web pages shortly before or on publication. Finally, WebCite can act as a focussed crawler, caching retrospectively the references of already published articles. Copyright issues are addressed by honouring respective Internet standards (robot exclusion files, no-cache and no-archive tags). Long-term preservation is ensured by agreements with libraries and digital preservation organizations. The resulting WebCite Index may also have applications for research assessment exercises, being able to measure the impact of Web services and published Web documents through access and Web citation metrics.
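As a sketch of how an author-side tool might trigger the archiving step described above: the www.webcitation.org archive endpoint and its query parameters below are assumptions inferred from the service description, not a documented API contract, so treat the whole call as hypothetical.

import requests

def webcite_archive(cited_url: str, contact_email: str) -> str:
    # Request archiving of a cited web page (assumed endpoint and params).
    resp = requests.get(
        "http://www.webcitation.org/archive",   # hypothetical archiving endpoint
        params={"url": cited_url, "email": contact_email},
        timeout=30,
    )
    resp.raise_for_status()
    # In practice one would parse the returned WebCite permalink out of
    # the response body; the final URL is used here as a placeholder.
    return resp.url

permalink = webcite_archive("http://example.org/dataset", "author@example.org")
print("cite this instead of the live page:", permalink)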
The interoperability skill of the Geographic Portal of the ISPRA - Geological Survey of Italy
NASA Astrophysics Data System (ADS)
Pia Congi, Maria; Campo, Valentina; Cipolloni, Carlo; Delogu, Daniela; Ventura, Renato; Battaglini, Loredana
2010-05-01
The Geographic Portal of the Geological Survey of Italy (ISPRA), available at http://serviziogeologico.apat.it/Portal, was planned according to the standard criteria of the INSPIRE directive. ArcIMS services and, at the same time, WMS and WFS services were implemented to satisfy different clients. For each database and web service, the metadata were written in accordance with ISO 19115. The management architecture of the portal allows it to encode client input and output requests both in ArcXML and in GML. Web applications and web services were implemented for each database owned by the Land Protection and Georesources Department, covering the geological maps at the scales 1:50,000 (CARG Project) and 1:100,000, the IFFI landslide inventory, the boreholes drilled under Law 464/84, the large-scale geological map and all the raster-format maps. The portal published thus far is at an experimental stage and will reach its final version through the development of a new graphical interface. The WMS and WFS services, including metadata, will be re-designed. The validity of the methodology and of the applied standards allows us to look ahead to further developments. In addition, it must be borne in mind that the new geological standard language (GeoSciML), which is already incorporated in the deployed web services, will allow better display and querying of geological data in an interoperable way. The characteristics of geological data demand specific libraries of symbols for cartographic mapping that are not yet available in a WMS service; this is another aspect concerning standards for geological information. The following have therefore been carried out so far: - a library of geological symbols to be used for printing, with a sketch of system colors, and a library for displaying data on screen, which almost completely solves the problems of point and area coverage data (also oriented) but still presents problems for linear data (solutions: ArcIMS services from ArcMap projects or a specific SLD implementation for WMS services); - an update of the "Guidelines for the supply of geological data", to be published shortly; - official involvement of the Geological Survey of Italy in the IUGS-CGI working group for the development and testing of the new GeoSciML language with WMS/WFS services. The availability of geographic information relies on metadata that can be distributed online so that search engines can find them through specialized searches. The metadata collected in catalogs are structured according to a standard (ISO 19135). The catalogs are a 'common' interface to locate, view and query data and metadata services, web services and other resources. Working in a growing sector of environmental knowledge, the focus is on attracting the participation of other parties that can enrich the available informative content, so as to arrive at a true portal of national interest, especially for disaster management.
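As a concrete example of the kind of standard request such a portal serves, the snippet below builds an OGC WMS 1.1.1 GetMap call with Python's requests. The endpoint path and layer name are hypothetical placeholders; only the KVP parameter names come from the WMS specification.

import requests

# Hypothetical WMS endpoint of the portal; parameter names follow WMS 1.1.1.
WMS_URL = "http://serviziogeologico.apat.it/wms"  # placeholder endpoint

params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "geology_50k",            # hypothetical layer name
    "STYLES": "",
    "SRS": "EPSG:4326",
    "BBOX": "6.6,35.5,18.5,47.1",       # roughly Italy, lon/lat
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
}

resp = requests.get(WMS_URL, params=params, timeout=60)
resp.raise_for_status()
with open("geology.png", "wb") as f:
    f.write(resp.content)   # rendered map image returned by the service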
ESB-based Sensor Web integration for the prediction of electric power supply system vulnerability.
Stoimenov, Leonid; Bogdanovic, Milos; Bogdanovic-Dinic, Sanja
2013-08-15
Electric power supply companies increasingly rely on enterprise IT systems to provide them with a comprehensive view of the state of the distribution network. Within a utility-wide network, enterprise IT systems collect data from various metering devices. Such data can be effectively used for the prediction of power supply network vulnerability. The purpose of this paper is to present the Enterprise Service Bus (ESB)-based Sensor Web integration solution that we have developed with the purpose of enabling prediction of power supply network vulnerability, in terms of a prediction of defect probability for a particular network element. We will give an example of its usage and demonstrate our vulnerability prediction model on data collected from two different power supply companies. The proposed solution is an extension of the GinisSense Sensor Web-based architecture for collecting, processing, analyzing, decision making and alerting based on the data received from heterogeneous data sources. In this case, GinisSense has been upgraded to be capable of operating in an ESB environment and combine Sensor Web and GIS technologies to enable prediction of electric power supply system vulnerability. Aside from electrical values, the proposed solution gathers ambient values from additional sensors installed in the existing power supply network infrastructure. GinisSense aggregates gathered data according to an adapted Omnibus data fusion model and applies decision-making logic on the aggregated data. Detected vulnerabilities are visualized to end-users through means of a specialized Web GIS application.
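The decision-making step described above can be caricatured in a few lines: fuse electrical and ambient sensor readings into a defect-probability score and flag elements crossing a threshold. This is a deliberately simplified stand-in for the adapted Omnibus fusion model; the features, weights, and threshold are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class ElementReading:
    element_id: str
    load_ratio: float        # electrical: current load / rated capacity
    ambient_temp_c: float    # ambient sensor on the same element
    humidity_pct: float

def defect_probability(r: ElementReading) -> float:
    # Toy fusion score in [0, 1]; weights are illustrative, not calibrated.
    score = (
        0.6 * min(r.load_ratio, 1.5) / 1.5
        + 0.3 * max(0.0, (r.ambient_temp_c - 30) / 40)
        + 0.1 * (r.humidity_pct / 100)
    )
    return min(score, 1.0)

readings = [
    ElementReading("TR-041", load_ratio=1.25, ambient_temp_c=52, humidity_pct=70),
    ElementReading("TR-007", load_ratio=0.55, ambient_temp_c=24, humidity_pct=40),
]

ALERT_THRESHOLD = 0.6   # assumed operator-defined limit
for r in readings:
    p = defect_probability(r)
    if p >= ALERT_THRESHOLD:
        print(f"ALERT {r.element_id}: defect probability {p:.2f}")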
ESB-Based Sensor Web Integration for the Prediction of Electric Power Supply System Vulnerability
Stoimenov, Leonid; Bogdanovic, Milos; Bogdanovic-Dinic, Sanja
2013-01-01
Electric power supply companies increasingly rely on enterprise IT systems to provide them with a comprehensive view of the state of the distribution network. Within a utility-wide network, enterprise IT systems collect data from various metering devices. Such data can be effectively used for the prediction of power supply network vulnerability. The purpose of this paper is to present the Enterprise Service Bus (ESB)-based Sensor Web integration solution that we have developed with the purpose of enabling prediction of power supply network vulnerability, in terms of a prediction of defect probability for a particular network element. We will give an example of its usage and demonstrate our vulnerability prediction model on data collected from two different power supply companies. The proposed solution is an extension of the GinisSense Sensor Web-based architecture for collecting, processing, analyzing, decision making and alerting based on the data received from heterogeneous data sources. In this case, GinisSense has been upgraded to be capable of operating in an ESB environment and combine Sensor Web and GIS technologies to enable prediction of electric power supply system vulnerability. Aside from electrical values, the proposed solution gathers ambient values from additional sensors installed in the existing power supply network infrastructure. GinisSense aggregates gathered data according to an adapted Omnibus data fusion model and applies decision-making logic on the aggregated data. Detected vulnerabilities are visualized to end-users through means of a specialized Web GIS application. PMID:23955435
Implementation and Evaluation of Four Interoperable Open Standards for the Internet of Things
Jazayeri, Mohammad Ali; Liang, Steve H. L.; Huang, Chih-Yuan
2015-01-01
Recently, researchers are focusing on a new use of the Internet called the Internet of Things (IoT), in which enabled electronic devices can be remotely accessed over the Internet. As the realization of the IoT concept is still in its early stages, manufacturers of Internet-connected devices and IoT web service providers are defining their own proprietary protocols based on their targeted applications. Consequently, the IoT is heterogeneous in terms of hardware capabilities and communication protocols. Addressing these heterogeneities by following open standards is a necessary step toward communicating with various IoT devices. In this research, we assess the feasibility of applying existing open standards on resource-constrained IoT devices. The standard protocols developed in this research are OGC PUCK over Bluetooth, TinySOS, SOS over CoAP, and the OGC SensorThings API. We believe that by hosting open standard protocols on IoT devices, not only do the devices become self-describable, self-contained, and interoperable, but innovative applications can also be easily developed with standardized interfaces. In addition, we use memory consumption, request message size, response message size, and response latency to benchmark the efficiency of the implemented protocols. In all, this research presents and evaluates standards-based solutions to better understand the feasibility of applying existing standards to the IoT vision. PMID:26402683
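The latency and message-size metrics used in the evaluation above are easy to reproduce for any HTTP-based IoT protocol. The sketch below times repeated requests against a device endpoint; the URL is a placeholder and the loop is a generic benchmarking pattern, not the authors' actual harness.

import statistics
import time
import requests

DEVICE_URL = "http://192.168.1.50/observations"   # placeholder device endpoint

def benchmark(url: str, trials: int = 20) -> dict:
    # Measure round-trip latency and response message size over many trials.
    latencies, sizes = [], []
    for _ in range(trials):
        t0 = time.perf_counter()
        resp = requests.get(url, timeout=10)
        latencies.append(time.perf_counter() - t0)
        sizes.append(len(resp.content))
    return {
        "median_latency_ms": 1000 * statistics.median(latencies),
        "mean_response_bytes": statistics.mean(sizes),
    }

print(benchmark(DEVICE_URL))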
Using Standardized Lexicons for Report Template Validation with LexMap, a Web-based Application.
Hostetter, Jason; Wang, Kenneth; Siegel, Eliot; Durack, Jeremy; Morrison, James J
2015-06-01
An enormous amount of data exists in unstructured diagnostic and interventional radiology reports. Free text or non-standardized terminologies limit the ability to parse, extract, and analyze these report data elements. Medical lexicons and ontologies contain standardized terms for relevant concepts including disease entities, radiographic technique, and findings. The use of standardized terms offers the potential to improve reporting consistency and facilitate computer analysis. The purpose of this project was to implement an interface to aid in the creation of standards-compliant reporting templates for use in interventional radiology. Non-standardized procedure report text was analyzed and referenced to RadLex, SNOMED-CT, and LOINC. Using JavaScript, a web application was developed which determined whether exact terms or synonyms in reports existed within these three reference resources. The NCBO BioPortal Annotator web service was used to map terms, and output from this application was used to create an interactive annotated version of the original report. The application was successfully used to analyze and modify five distinct reports for the Society of Interventional Radiology's standardized reporting project.
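For readers who want to try the same mapping step, the NCBO BioPortal Annotator is reachable as a simple REST call. The parameter names below follow the public BioPortal REST documentation as best I recall it, and the API key is a placeholder; verify both against the current service before relying on them.

import requests

BIOPORTAL_URL = "https://data.bioontology.org/annotator"
API_KEY = "YOUR_BIOPORTAL_API_KEY"   # placeholder; register at bioontology.org

def annotate(report_text: str):
    # Map free-text report terms to RadLex / SNOMED CT / LOINC concepts.
    resp = requests.get(
        BIOPORTAL_URL,
        params={
            "text": report_text,
            "ontologies": "RADLEX,SNOMEDCT,LOINC",   # assumed ontology acronyms
            "apikey": API_KEY,
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

for hit in annotate("Right hepatic artery embolization with coils"):
    print(hit["annotatedClass"]["@id"])   # URI of each matched concept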
Embedding the shapes of regions of interest into a Clinical Document Architecture document.
Minh, Nguyen Hai; Yi, Byoung-Kee; Kim, Il Kon; Song, Joon Hyun; Binh, Pham Viet
2015-03-01
Sharing a medical image visually annotated with a region of interest with a remotely located specialist for consultation is good practice. It may, however, require a special-purpose (and most likely expensive) system to send and view the images, which is an unfeasible solution in developing countries such as Vietnam. In this study, we design and implement interoperable methods based on the HL7 Clinical Document Architecture and the Extensible Stylesheet Language Transformations (XSLT) standards to seamlessly exchange and visually present the shapes of regions of interest using web browsers. We also propose a new integration architecture for a Clinical Document Architecture generator that enables embedding of regions of interest and simultaneous auto-generation of corresponding style sheets. Using the Clinical Document Architecture document and style sheet, a sender can transmit clinical documents and medical images together with the coordinate values of regions of interest to recipients. Recipients can easily view the documents and display the embedded regions of interest by rendering them in their web browser of choice. © The Author(s) 2014.
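A minimal sketch of the approach using lxml: embed ROI vertex coordinates in an XML fragment and let an XSLT style sheet turn them into an SVG polygon that any browser can draw over the image. The element and attribute names are invented for illustration and are not the CDA markup the authors used.

from lxml import etree

# Hypothetical fragment carrying ROI vertices (not actual CDA markup).
doc = etree.XML(
    '<regionOfInterest image="ct_slice_12.png">'
    '<point x="120" y="80"/><point x="200" y="95"/><point x="160" y="170"/>'
    '</regionOfInterest>'
)

# Style sheet rendering the points as one SVG polygon outline.
xslt = etree.XML("""
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/regionOfInterest">
    <svg xmlns="http://www.w3.org/2000/svg" width="512" height="512">
      <polygon fill="none" stroke="red">
        <xsl:attribute name="points">
          <xsl:for-each select="point">
            <xsl:value-of select="concat(@x, ',', @y, ' ')"/>
          </xsl:for-each>
        </xsl:attribute>
      </polygon>
    </svg>
  </xsl:template>
</xsl:stylesheet>
""")

print(etree.tostring(etree.XSLT(xslt)(doc), pretty_print=True).decode())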
Applications of Dynamic Deployment of Services in Industrial Automation
NASA Astrophysics Data System (ADS)
Candido, Gonçalo; Barata, José; Jammes, François; Colombo, Armando W.
Service-oriented Architecture (SOA) is becoming a de facto paradigm for business and enterprise integration. SOA is expanding into several domains of application, envisioning a unified solution suitable across all the different layers of an enterprise infrastructure. The application of SOA based on open web standards can significantly enhance the interoperability and openness of those devices. By embedding a dynamic deployment service even into small field devices, it becomes possible both to let machine builders place built-in services and to let the integrator deploy at run time the services that best fit the current application. This approach allows the developer to keep his own preferred development language, but still deliver a SOA-compliant application. A dynamic deployment service is envisaged as a fundamental framework to support more complex applications, reducing deployment delays while increasing overall system agility. As a use-case scenario, a dynamic deployment service was implemented over the DPWS and WS-Management specifications, allowing an automation application to be designed and programmed using IEC 61131 languages and these components to be deployed as web services into devices.
Hashem, Ahmad; Ruggeri, Roberto
2003-01-01
Creating an integration-friendly infrastructure is the best way to prepare for current as well as next-generation POC devices. Fortunately, hassle-free system integration is needed throughout healthcare today for reasons beyond the domain of POC devices, so integration progress already made in other areas helps pave the way for POC devices. Healthcare IT professionals, as much or more than any other IT group, have had to deal with the challenges of integrating disparate systems. When considering deployment of POC devices, you will want to make sure they adhere to open industry standards. In this way, these important devices won't add to the problem of disparate systems, but will contribute to the solution. We see XML and Web services as being especially important to the future of healthcare delivery and administration. XML and Web services, along with powerful orchestrators, can provide an ever-richer collection of clinical data to be delivered to, and received from, the POC devices that will become an ever more important addition to the physician armamentarium.
NASA Astrophysics Data System (ADS)
Signell, R. P.; Camossi, E.
2015-11-01
Work over the last decade has resulted in standardized web services and tools that can significantly improve the efficiency and effectiveness of working with meteorological and ocean model data. While many operational modelling centres have enabled query and access to data via common web services, most small research groups have not. The penetration of this approach into the research community, where IT resources are limited, can be dramatically improved by: (1) making it simple for providers to enable web service access to existing output files; (2) using technology that is free, and that is easy to deploy and configure; and (3) providing tools to communicate with web services that work in existing research environments. We present a simple, local brokering approach that lets modelers continue producing custom data, but virtually aggregates and standardizes the data using the NetCDF Markup Language. The THREDDS Data Server is used for data delivery, pycsw for data search, NCTOOLBOX (Matlab®) and Iris (Python) for data access, and the Open Geospatial Consortium Web Map Service for data preview. We illustrate the effectiveness of this approach with two use cases involving small research modelling groups at NATO and USGS. (Mention of trade names or commercial products does not constitute endorsement or recommendation for use by the US Government.)
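On the data-access side, the pattern above boils down to opening an OPeNDAP URL published by the THREDDS Data Server as if it were a local file. The catalog URL and variable name below are placeholders; the netCDF4-python call itself is standard.

from netCDF4 import Dataset

# Placeholder OPeNDAP endpoint exposed by a THREDDS Data Server.
URL = "http://example.org/thredds/dodsC/model/ocean_agg.nc"

ds = Dataset(URL)                                # no download of the full file
sst = ds.variables["sea_surface_temperature"]    # hypothetical variable name
print(sst.dimensions, sst.shape)

# Subsetting happens server-side: only this slab crosses the network.
surface_field = sst[-1, :, :]
ds.close()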
NASA Astrophysics Data System (ADS)
Jirka, Simon; del Rio, Joaquin; Toma, Daniel; Martinez, Enoc; Delory, Eric; Pearlman, Jay; Rieke, Matthes; Stasch, Christoph
2017-04-01
With the rapidly evolving technology for building Web-based (spatial) information infrastructures and Sensor Webs, there are new opportunities to improve the process by which ocean data are collected and managed. A central element in this development is the suite of Sensor Web Enablement (SWE) standards specified by the Open Geospatial Consortium (OGC). This framework of standards comprises, on the one hand, data models and formats for measurement data (ISO/OGC Observations and Measurements, O&M) and metadata describing measurement processes and sensors (OGC Sensor Model Language, SensorML). On the other hand, the SWE standards comprise (Web service) interface specifications for pull-based access to observation data (OGC Sensor Observation Service, SOS) and for controlling or configuring sensors (OGC Sensor Planning Service, SPS). Within the European INSPIRE framework, too, the SWE standards play an important role, as the SOS is the recommended download service interface for O&M-encoded observation data sets. In the context of the EU-funded Oceans of Tomorrow initiative, the NeXOS (Next generation, Cost-effective, Compact, Multifunctional Web Enabled Ocean Sensor Systems Empowering Marine, Maritime and Fisheries Management) project is developing a new generation of in-situ sensors that make use of the SWE standards to facilitate the data publication process and the integration into Web-based information infrastructures. This includes the development of a dedicated firmware for instruments and sensor platforms (SEISI, Smart Electronic Interface for Sensors and Instruments) maintained by the Universitat Politècnica de Catalunya (UPC). Among other features, SEISI makes use of OGC SWE standards such as OGC-PUCK to enable a plug-and-play mechanism for sensors based on SensorML-encoded metadata. Thus, if a new instrument is attached to a SEISI-based platform, the platform automatically configures the connection to the instrument, automatically generates data files compliant with the ISO/OGC Observations and Measurements standard, and initiates the data transmission into the NeXOS Sensor Web infrastructure. Besides these platform-related developments, NeXOS has realised the full path of data transmission from the sensor to the end-user application. The conceptual architecture design is implemented by a series of open source SWE software packages provided by 52°North. This comprises different SWE server components (i.e. the OGC Sensor Observation Service), tools for data visualisation (e.g. the 52°North Helgoland SOS viewer), and an editor for providing SensorML-based metadata (52°North smle). As a result, NeXOS has demonstrated how the SWE standards help to improve marine observation data collection. Within this presentation, we will present the experiences and findings of the NeXOS project and provide recommendations for future work directions.
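To make the pull-based access path concrete, here is a minimal SOS 2.0 KVP GetObservation request in Python. The endpoint, offering, and observed property are hypothetical NeXOS-style placeholders; the parameter names follow the OGC SOS 2.0 KVP binding.

import requests

SOS_URL = "http://example.org/52n-sos/service"   # placeholder SOS endpoint

params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "nexos_platform_1",                           # hypothetical offering
    "observedProperty": "http://vocab.example/sea_water_temperature",
    # KVP temporal filter: valueReference, then an ISO 8601 interval.
    "temporalFilter": "om:phenomenonTime,2017-01-01T00:00:00Z/2017-01-02T00:00:00Z",
    "responseFormat": "http://www.opengis.net/om/2.0",
}

resp = requests.get(SOS_URL, params=params, timeout=60)
resp.raise_for_status()
print(resp.text[:500])   # O&M-encoded observations (XML)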
Web-Enabled Systems for Student Access.
ERIC Educational Resources Information Center
Harris, Chad S.; Herring, Tom
1999-01-01
California State University, Fullerton is developing a suite of server-based, Web-enabled applications that distribute the functionality of its student information system software to external customers without modifying the mainframe applications or databases. The cost-effective, secure, and rapidly deployable business solution involves using the…
Parker, Siddhartha; Zipursky, Jonathan; Ma, Helen; Baumblatt, Geri-Lynn; Siegel, Corey A
2018-07-01
Assess the impact of a web-based multimedia patient engagement program on patient anxiety, perception and knowledge about the colonoscopy in addition to procedure outcomes. The success of patients coming for a colonoscopy for colorectal cancer screening is dependent in part on patients' understanding of the preparation and of the procedure. Patients were randomized to use either our institution's standard preprocedure colonoscopy packet or a web-based multimedia patient engagement program (Emmi Solutions) before their scheduled procedure. On the day of colonoscopy, all participants completed a survey including questions to assess knowledge and perception of colonoscopy, in addition to the State Trait Anxiety Inventory. We also collected procedure data including medication doses and procedure time. Patients in the experimental group correctly answered knowledge questions (82%) more often than the control group (74%) (P=0.0003). More than half (58%) of patients in the experimental group felt this intervention reduced their anxiety about the procedure, and the State Trait Anxiety Inventory anxiety score was lower in the experimental group (P=0.026). Patients who viewed the program required less midazolam (3.66 vs. 4.46 mg, P=0.0035) and total procedure time was shorter (24.8 vs. 29 min, P=0.024). A web-based multimedia patient engagement program watched before colonoscopy decreased patient anxiety, medication requirements, and procedure time while increasing knowledge. This intervention could help patients understand and feel more comfortable about colonoscopy leading to increased screening rates while increasing efficiency and decreasing recovery time.
Publishing biomedical journals on the World-Wide Web using an open architecture model.
Shareck, E. P.; Greenes, R. A.
1996-01-01
BACKGROUND: In many respects, biomedical publications are ideally suited for distribution via the World-Wide Web, but economic concerns have prevented the rapid adoption of an on-line publishing model. PURPOSE: We report on our experiences with assisting biomedical journals in developing an online presence, issues that were encountered, and methods used to address these issues. Our approach is based on an open architecture that fosters adaptation and interconnection of biomedical resources. METHODS: We have worked with the New England Journal of Medicine (NEJM), as well as five other publishers. A set of tools and protocols was employed to develop a scalable and customizable solution for publishing journals on-line. RESULTS: In March, 1996, the New England Journal of Medicine published its first World-Wide Web issue. Explorations with other publishers have helped to generalize the model. CONCLUSIONS: Economic and technical issues play a major role in developing World-Wide Web publishing solutions. PMID:8947685
Beerthuizen, Thijs; Voorend-van Bergen, Sandra; van den Hout, Wilbert B; Vaessen-Verberne, Anja A; Brackel, Hein J; Landstra, Anneke M; van den Berg, Norbert J; de Jongste, Johan C; Merkus, Peter J; Pijnenburg, Mariëlle W; Sont, Jacob K
2016-07-01
In children with asthma, web-based monitoring and inflammation-driven therapy may lead to improved asthma control and reduction in medications. However, the cost-effectiveness of these monitoring strategies is yet unknown. We assessed the cost-effectiveness of web-based monthly monitoring and of 4-monthly monitoring of FENO as compared with standard care. An economic evaluation was performed alongside a randomised controlled multicentre trial with a 1-year follow-up. Two hundred and seventy-two children with asthma, aged 4-18 years, were randomised to one of three strategies. In standard care, treatment was adapted according to Asthma Control Test (ACT) at 4-monthly visits, in the web-based strategy also according to web-ACT at 1 month intervals, and in the FENO-based strategy according to ACT and FENO at 4-monthly visits. Outcome measures were patient utilities, healthcare costs, societal costs and incremental cost per quality-adjusted life year (QALY) gained. No statistically significant differences were found in QALYs and costs between the three strategies. The web-based strategy had 77% chance of being most cost-effective from a healthcare perspective at a willingness to pay a generally accepted €40 000/QALY. The FENO-based strategy had 83% chance of being most cost-effective at €40 000/QALY from a societal perspective. Economically, web-based monitoring was preferred from a healthcare perspective, while the FENO-based strategy was preferred from a societal perspective, although in QALYs and costs no statistically significant changes were found as compared with standard care. As clinical outcomes also favoured the web-based and FENO-based strategies, these strategies may be useful additions to standard care. Netherlands Trial Register (NTR1995). Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
NASA Astrophysics Data System (ADS)
Satoh, Hitoshi; Niki, Noboru; Eguchi, Kenji; Ohmatsu, Hironobu; Kaneko, Masahiro; Kakinuma, Ryutaro; Moriyama, Noriyuki
2010-03-01
Diagnostic MDCT imaging requires a considerable number of images to be read. Moreover, there is a shortage of doctors in Japan who can diagnose medical images. Against this background, we have provided diagnostic assistance methods to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images, a coronary artery calcification screening algorithm that automatically detects suspected coronary artery calcification, and a vertebral body analysis algorithm for the quantitative evaluation of osteoporosis. We have also developed a teleradiology network system based on a web medical image conference system. In a teleradiology network system, information security is a very important subject. Our teleradiology network system can hold web medical image conferences among medical institutions at remote locations using the web medical image conference system. We completed a basic proof experiment of the web medical image conference system with an information security solution. The screen of the web medical image conference system can be shared by two or more web conference terminals at the same time. Opinions can be exchanged using a camera and a microphone connected to a workstation with built-in diagnostic assistance methods. Biometric face authentication used at the teleradiology site makes file encryption and login control effective. The privacy and information security technology of our information security solution ensures compliance with Japanese regulations, so that patients' private information is protected. Based on these diagnostic assistance methods, we have developed a new computer-aided workstation and a new teleradiology network that can display suspected lesions three-dimensionally in a short time. The results of this study indicate that our filmless radiological information system, built on the computer-aided diagnosis workstation, and our teleradiology network system can increase diagnostic speed and accuracy and improve the security of medical information.
Mattsson, Brady J; Fischborn, Marie; Brunson, Mark; Vacik, Harald
2018-03-30
Protected areas (PAs) can generate many benefits inside and outside their borders, and achieving objectives for diverse stakeholders raises many challenges. There are many examples of successful PA management around the globe, although a systematic and comprehensive approach to developing and sharing these solutions has been lacking. We present "solutioning" as a structured process of peer-learning, which can inform management strategies in and around protected areas. We explain how the PANORAMA-Solutions for a Healthy Planet initiative has put solutioning into practice through an interactive community and web portal to learn about protected area solutions around the globe. Unlike other web platforms and initiatives reviewed, PANORAMA facilitates adaptation of solution elements (i.e., building blocks) for novel implementation. Supported by theories of resilience and peer-learning, PANORAMA appears to have potential to promote efficiency and equitable benefits for PAs and associated stakeholders focused on nature conservation and sustainable development, although further research is needed to assess whether this learning leads to better solutions or more effective PA management.
An integrative solution for managing, tracing and citing sensor-related information
NASA Astrophysics Data System (ADS)
Koppe, Roland; Gerchow, Peter; Macario, Ana; Schewe, Ingo; Rehmcke, Steven; Düde, Tobias
2017-04-01
In a data-driven scientific world, the need to capture information on the sensors used in the data acquisition process has become increasingly important. Following the recommendations of the Open Geospatial Consortium (OGC), we started by adopting the SensorML standard for describing platforms, devices and sensors. However, it soon became obvious to us that understanding, implementing and filling such standards costs significant effort and cannot be expected from every scientist individually. So we developed a web-based sensor management solution (https://sensor.awi.de) for describing platforms, devices and sensors as a hierarchy of systems which supports tracing changes to a system while hiding complexity. Each platform contains devices, and each device can have sensors associated with specific identifiers, contacts, events, related online resources (e.g. manufacturer factsheets, calibration documentation, data processing documentation), sensor output parameters and geo-location. In order to better understand and address real-world requirements, we interacted closely with field-going scientists in the context of the key national infrastructure project "FRontiers in Arctic marine Monitoring ocean observatory" (FRAM) during the software development. We learned that not only the lineage of observations is crucial for scientists, but also, for example, alert services using value ranges, flexible output formats and information on data providers (e.g. FTP sources). Most importantly, persistent and citable versions of sensor descriptions are required for traceability and reproducibility, allowing seamless integration with existing information systems, e.g. PANGAEA. Within the context of the EU-funded Ocean Data Interoperability Platform project (ODIP II) and in cooperation with 52north, we are providing near real-time data via Sensor Observation Services (SOS) along with sensor descriptions based on our sensor management solution. ODIP II also aims to develop a harmonized SensorML profile for the marine community, which we will adopt in our solution as soon as it is available. In this presentation we will show our sensor management solution, which is embedded in our data flow framework to offer out-of-the-box interoperability with existing information systems and standards. In addition, we will present real-world examples and challenges related to the description and traceability of sensor metadata.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-04
... Web site. E-mail: Comments may be sent by electronic mail (e-mail) to a-and-r[email protected] otherwise protected through http://www.regulations.gov or e-mail. The http://www.regulations.gov Web site is... Web site: http://www.epa.gov/airquality/combustion . Please refer to this Web site to confirm the date...
BPELPower—A BPEL execution engine for geospatial web services
NASA Astrophysics Data System (ADS)
Yu, Genong (Eugene); Zhao, Peisheng; Di, Liping; Chen, Aijun; Deng, Meixia; Bai, Yuqi
2012-10-01
The Business Process Execution Language (BPEL) has become a popular choice for orchestrating and executing workflows in the Web environment. As one special kind of scientific workflow, geospatial Web processing workflows are data-intensive, deal with complex structures in data and geographic features, and execute automatically with limited human intervention. To enable the proper execution and coordination of geospatial workflows, a specially enhanced BPEL execution engine is required. BPELPower was designed, developed, and implemented as a generic BPEL execution engine with enhancements for executing geospatial workflows. The enhancements are especially in its capabilities in handling Geography Markup Language (GML) and standard geospatial Web services, such as the Web Processing Service (WPS) and the Web Feature Service (WFS). BPELPower has been used in several demonstrations over the decade. Two scenarios were discussed in detail to demonstrate the capabilities of BPELPower. That study showed a standard-compliant, Web-based approach for properly supporting geospatial processing, with the only enhancement at the implementation level. Pattern-based evaluation and performance improvement of the engine are discussed: BPELPower directly supports 22 workflow control patterns and 17 workflow data patterns. In the future, the engine will be enhanced with high performance parallel processing and broad Web paradigms.
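A typical leaf call inside such a geospatial workflow is a WPS Execute request. The sketch below issues one via the WPS 1.0.0 KVP binding; the endpoint, process identifier, and input names are placeholders, while the parameter keys come from the specification.

import requests

WPS_URL = "http://example.org/wps"   # placeholder WPS endpoint

params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "demo:Buffer",               # hypothetical process id
    # KVP DataInputs: 'name=value' pairs separated by ';' (literal inputs here).
    "datainputs": "distance=100;units=m",
}

resp = requests.get(WPS_URL, params=params, timeout=120)
resp.raise_for_status()
print(resp.text[:400])   # ExecuteResponse XML with status and outputs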
Developing Collections of Web-Published Materials
ERIC Educational Resources Information Center
Hsieh, Inga K.; Murray, Kathleen R.; Hartman, Cathy Nelson
2007-01-01
Librarians and archivists face challenges when adapting traditional collection development practices to meet the unique characteristics of Web-published materials. Likewise, preservation activities for Web-published materials must be undertaken at the outset of collection development lest they be lost forever. Standards and best practices for…
Improving Science Communication with Responsive Web Design
NASA Astrophysics Data System (ADS)
Hilverda, M.
2013-12-01
Effective science communication requires clarity in both content and presentation. Content is increasingly being viewed via the Web across a broad range of devices, which can vary in screen size, resolution, and pixel density. Readers access the same content from desktop computers, tablets, smartphones, and wearable computing devices. Creating separate presentation formats optimized for each device is inefficient and unrealistic as new devices continually enter the marketplace. Responsive web design is an approach that puts content first within a presentation design that responds automatically to its environment. This allows for one platform to be maintained that can be used effectively for every screen. The layout adapts to screens of all sizes, ensuring easy viewing of content for readers regardless of their device. Responsive design is accomplished primarily by the use of media queries within style sheets, which allow changes to layout properties to be defined based on media types (i.e. screen, print) and resolution. Images and other types of multimedia can also be defined to scale automatically to fit different screen dimensions, although some media types require additional effort for proper implementation. Hardware changes, such as high pixel density screens, also present new challenges for effective presentation of content. High pixel density screens contain a greater number of pixels within a screen area, increasing the pixels per inch (PPI) compared to standard screens. The result is increased clarity for text and vector media types, but often decreased clarity for standard resolution raster images. Media queries and other custom solutions can assist by specifying higher resolution images for high pixel density screens. Unfortunately, increasing image resolution results in significantly more data being transferred to the device. Web traffic on mobile devices such as smartphones and tablets is on a steady growth trajectory, and many mobile devices around the world use low-bandwidth connections. Communicating science effectively includes efficient delivery of the information to the reader. To meet these criteria, responsive designs should also incorporate "mobile first" elements such as serving ideal image sizes (a low resolution cell phone does not need to receive a large desktop image) and a focus on fast, readable content delivery. The technical implementation of responsive web design is constantly changing as new web standards and approaches become available. However, fundamental design principles such as grid layouts, clear typography, and proper use of white space should be an important part of content delivery within any responsive design. This presentation will discuss current responsive design approaches for improving scientific communication across multiple devices, operating systems, and bandwidth capacities. The presentation will also include example responsive designs for scientific papers and websites. Implementing a responsive design approach with a focus on content and fundamental design principles is an important step to ensuring scientific information remains clear and accessible as screens and devices continue to evolve.
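The "mobile first" image-serving point above can be illustrated server-side: pick the smallest prepared rendition that satisfies the client's effective pixel width instead of always shipping the desktop asset. The rendition table and the assumption that the client reports its viewport width and devicePixelRatio (e.g. via a query string or Client Hints) are illustrative.

# Minimal sketch of server-side responsive image selection; rendition
# widths and the width/DPR hints are illustrative assumptions.
RENDITIONS = [320, 640, 1024, 2048]   # available image widths in pixels

def pick_rendition(viewport_css_px: int, device_pixel_ratio: float = 1.0) -> int:
    # Return the smallest rendition wide enough for this screen.
    needed = viewport_css_px * device_pixel_ratio   # effective device pixels
    for width in RENDITIONS:
        if width >= needed:
            return width
    return RENDITIONS[-1]                           # cap at the largest asset

# A 375 CSS-px phone with a 3x high-density screen needs ~1125 device px,
# so it gets the 2048 rendition; a 1x laptop at 1000 px gets the 1024 one.
print(pick_rendition(375, 3.0), pick_rendition(1000, 1.0))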
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-17
.... EPA-HQ-OAR-2002-0037. All documents in the docket are listed on the http://www.regulations.gov Web... voluntary consensus standards VOC volatile organic compound WWW World Wide Web Organization of This Document. The following outline is provided to aid in locating information in this preamble. I. General...
The Role of Faculty in the Effectiveness of Fully Online Programs
ERIC Educational Resources Information Center
Al-Salman, Sami M.
2013-01-01
The enormous growth of online learning creates the need to develop a set of standards and guidelines for fully online programs. While many guidelines do exist, web-based programs still fall short in the recognition, adoption, or the implementation of these standards. One consequence is the high attrition rates associated with web-based distance…
ERIC Educational Resources Information Center
Taylor, Arthur; Dalal, Heather A.
2014-01-01
Introduction: This paper aims to determine how appropriate information literacy instruction is for preparing students for these unmediated searches using commercial search engines and the Web. Method. A survey was designed using the 2000 Association of College and Research Libraries literacy competency standards for higher education. Survey…
C-C1-04: Building a Health Services Information Technology Research Environment
Gehrum, David W; Jones, JB; Romania, Gregory J; Young, David L; Lerch, Virginia R; Bruce, Christa A; Donkochik, Diane; Stewart, Walter F
2010-01-01
Background: The electronic health record (EHR) has opened a new era for health services research (HSR) where information technology (IT) is used to re-engineer care processes. While the EHR provides one means of advancing novel solutions, a promising strategy is to develop tools (e.g., online questionnaires, visual display tools, decision support) distinct from, but which interact with, the EHR. Development of such software tools outside the EHR offers an advantage in flexibility, sophistication, and ultimately in portability to other settings. However, institutional IT departments have an imperative to protect patient data and to standardize IT processes to ensure system-level security and support traditional business needs. Such imperatives usually present formidable process barriers to testing novel software solutions. We describe how, in collaboration with our IT department, we are creating an environment and a process that allows for routine and rapid testing of novel software solutions. Methods: We convened a working group consisting of IT and research personnel with expertise in information security, database design/management, web design, EHR programming, and health services research. The working group was tasked with developing a research IT environment to accomplish two objectives: maintain network/data security and regulatory compliance; and allow researchers working with external vendors to rapidly prototype and, in a clinical setting, test web-based tools. Results: Two parallel solutions, one focused on hardware, the second on oversight and management, were developed. First, we concluded that three separate, staged development environments were required to allow external vendor access for testing software and for transitioning software to be used in a clinic. In parallel, the extant oversight process for approving and managing access for internal/external personnel had to be altered to reflect the scope and scale of discrete research projects, as opposed to an enterprise-level approach to IT management. Conclusions: Innovation in health services software development requires a flexible, scalable IT environment adapted to the unique objectives of an HSR software development model. In our experience, implementing the hardware solution is less challenging than the cultural change required to implement such a model and the modifications to administrative and oversight processes needed to sustain an environment for rapid product development and testing.
NASA Astrophysics Data System (ADS)
Wang, J.; Song, J.; Gao, M.; Zhu, L.
2014-02-01
The trans-boundary area between northern China, Mongolia and eastern Siberia in Russia is a continuous geographical region in northeastern Asia. Many common issues in this region need to be addressed on the basis of a uniform resources and environmental data warehouse. Based on the practice of a joint scientific expedition, the paper presents a data integration solution comprising three steps: drafting data collection standards and specifications, data reorganization and processing, and data warehouse design and development. A series of data collection standards and specifications covering more than 10 domains was drawn up first. According to the uniform standard, 20 regional-scale resources and environmental survey databases and 11 in-situ observation databases were reorganized and integrated. The North East Asia Resources and Environmental Data Warehouse was designed with 4 layers: a resources layer, a core business logic layer, an internet interoperation layer, and a web portal layer. An initial data warehouse prototype was developed and deployed. All the integrated data for this area can be accessed online.
OpenID Connect as a security service in cloud-based medical imaging systems
Ma, Weina; Sartipi, Kamran; Sharghigoorabi, Hassan; Koff, David; Bak, Peter
2016-01-01
The evolution of cloud computing is driving the next generation of medical imaging systems. However, privacy and security concerns have consistently been regarded as the major obstacles to the adoption of cloud computing in healthcare. OpenID Connect, combining OpenID and OAuth, is an emerging representational state transfer-based federated identity solution. It is one of the most widely adopted open standards and may become the de facto standard for securing cloud computing and mobile applications; it has even been called the "Kerberos of the cloud." We introduce OpenID Connect as an authentication and authorization service in cloud-based diagnostic imaging (DI) systems and propose enhancements that allow this technology to be incorporated within distributed enterprise environments. The objective of this study is to offer solutions for secure sharing of medical images among diagnostic imaging repositories (DI-r) and heterogeneous picture archiving and communication systems (PACS), as well as Web-based and mobile clients, in the cloud ecosystem. The main objective is to use the OpenID Connect open-source single sign-on and authorization service in a user-centric manner, while ensuring that deploying DI-r and PACS to private or community clouds provides security levels equivalent to the traditional computing model. PMID:27340682
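As an aside on the protocol the abstract above describes: the OpenID Connect authorization code flow can be sketched in a few lines of Python. This is illustrative only; the issuer URL, client credentials, and redirect URI are hypothetical placeholders, not values from the study.

```python
# Hedged sketch of the OpenID Connect authorization code flow.
# All endpoints, credentials, and scopes are hypothetical placeholders.
import secrets
from urllib.parse import urlencode

import requests

ISSUER = "https://idp.example-hospital.org"    # hypothetical OpenID Provider
CLIENT_ID = "di-r-viewer"                      # hypothetical client
CLIENT_SECRET = "s3cret"
REDIRECT_URI = "https://pacs.example.org/callback"

# Step 1: redirect the user's browser to the authorization endpoint.
state = secrets.token_urlsafe(16)              # CSRF protection
auth_url = ISSUER + "/authorize?" + urlencode({
    "response_type": "code",
    "client_id": CLIENT_ID,
    "redirect_uri": REDIRECT_URI,
    "scope": "openid profile",                 # 'openid' marks an OIDC request
    "state": state,
})

# Step 2 (after the provider redirects back with ?code=...): exchange the
# authorization code for an ID token and an access token.
def exchange_code(code: str) -> dict:
    resp = requests.post(ISSUER + "/token", data={
        "grant_type": "authorization_code",
        "code": code,
        "redirect_uri": REDIRECT_URI,
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
    }, timeout=10)
    resp.raise_for_status()
    return resp.json()                         # contains id_token, access_token
```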
Egbring, Marco; Kullak-Ublick, Gerd A; Russmann, Stefan
2010-01-01
To develop a software solution that supports management and clinical review of patient data from electronic medical records databases or claims databases for pharmacoepidemiological drug safety studies. We used open source software to build a data management system and an internet application with a Flex client on a Java application server with a MySQL database backend. The application is hosted on Amazon Elastic Compute Cloud. This solution, named Phynx, supports data management, Web-based display of electronic patient information, and interactive review of patient-level information in the individual clinical context. The system was applied to a dataset from the UK General Practice Research Database (GPRD). Our solution can be set up and customized with limited programming resources, and there is almost no extra cost for software. Access times are short, the displayed information is structured in chronological order and visually attractive, and selected information such as drug exposure can be blinded. External experts can review patient profiles and save evaluations and comments via a common Web browser. Phynx provides a flexible and economical solution for patient-level review of electronic medical information from databases in the individual clinical context. It can therefore make an important contribution to efficient validation of outcome assessment in drug safety database studies.
Semantic-Web Architecture for Electronic Discharge Summary Based on OWL 2.0 Standard.
Tahmasebian, Shahram; Langarizadeh, Mostafa; Ghazisaeidi, Marjan; Safdari, Reza
2016-06-01
Patients' electronic medical records contain all information related to the treatment processes during hospitalization. One of the most important documents in this record is the record summary, which presents a summary of the whole treatment process and is used for subsequent treatments and other issues pertaining to care. With a suitable architecture for this document, apart from the aforementioned uses, it can also serve other purposes such as data mining or case-based decision making. In this study, a model for the patient's medical record summary was first proposed using semantic-web-based architecture. Then, based on service-oriented architecture and using the Java programming language, a software solution was designed and implemented to generate medical record summaries with this structure, and finally new uses of this structure are explained. The study offers a structure for medical record summaries, along with corrective points, within the semantic web, and provides software running on Java along with dedicated ontologies. After discussing the project with experts in medical/health data management and medical informatics as well as clinical experts, it became clear that the suggested design for the medical record summary, apart from covering many issues currently faced in medical records, also has many advantages, including its use in research projects, case-based decision making, etc.
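To make the proposed semantic-web structure concrete, here is a minimal Python sketch (using the rdflib library) of how a discharge-summary record could be expressed as RDF triples. The ontology namespace and property names are hypothetical illustrations, not the authors' published schema.

```python
# A minimal sketch, assuming a hypothetical ontology namespace, of a
# discharge summary expressed as RDF triples. Not the study's actual schema.
from rdflib import Graph, Literal, Namespace, RDF

EHR = Namespace("http://example.org/discharge-summary#")  # hypothetical

g = Graph()
g.bind("ehr", EHR)

summary = EHR["summary/12345"]
g.add((summary, RDF.type, EHR.DischargeSummary))
g.add((summary, EHR.admissionDiagnosis, Literal("community-acquired pneumonia")))
g.add((summary, EHR.dischargeMedication, Literal("amoxicillin 500 mg")))

# Serialize as Turtle: human-readable and machine-parsable.
print(g.serialize(format="turtle"))
```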
Using the World Wide WEB to promote science education in nuclear energy and RWM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robinson, M.
1996-12-31
A priority of government and business in the United States and other first-tier industrial countries continues to be the improvement of science, mathematics and technology (SMT) instruction in pre-university education. The U.S. federal government has made SMT instruction an educational priority and set goals for improving it in the belief that science, math and technology education are tied to our economic well-being and standard of living. The new national standards in mathematics education and science education, and the proposed standards in technology education, are all aimed at improving knowledge and skills in the essential areas that the federal government considers important for protecting our technological advantage in the world economy. This paper will discuss a pilot project for establishing graphical Web capability in a limited number (six) of rural Nevada schools with support from the US Department of Energy (DOE) and the state of Nevada. The general goals of the pilot project are as follows: (1) to give rural teachers and students access to up-to-date science information on the Web; (2) to determine whether Web access can improve science teaching and student attitudes toward science in rural Nevada schools; and (3) to identify science content on the Web that supports the National Science Standards and Benchmarks. A specific objective that this paper will address is stated as the following question: What potential do nuclear energy information office web sites offer for changing student attitudes about nuclear energy and creating greater nuclear literacy?
Design Drivers of Water Data Services
NASA Astrophysics Data System (ADS)
Valentine, D.; Zaslavsky, I.
2008-12-01
The CUAHSI Hydrologic Information System (HIS) is being developed as a geographically distributed network of hydrologic data sources and functions that are integrated using web services so that they function as a connected whole. The core of the HIS service-oriented architecture is a collection of water web services, which provide uniform access to multiple repositories of observation data. These services use SOAP protocols communicating WaterML (Water Markup Language). When a client makes a data or metadata request using a CUAHSI HIS web service, these requests are made in a standard manner, following the CUAHSI HIS web service signatures, regardless of how the underlying data source may be organized. Also, regardless of the format in which the data are returned by the source, the web services respond to requests by returning the data in the standard format of WaterML. The goal of WaterML design has been to capture the semantics of hydrologic observations discovery and retrieval and to express the point observations information model as an XML schema. To a large extent, it follows the representation of the information model as adopted by the CUAHSI Observations Data Model (ODM) relational design. Another driver of WaterML design is the specifications and metadata adopted by USGS NWIS, EPA STORET, and other federal agencies, as it seeks to provide a common foundation for exchanging both agency data and data collected in multiple academic projects. Another WaterML design principle was to create, in version 1 of HIS in particular, a fairly rigid and simple XML schema that is easy to generate and parse, thus creating the least barrier for adoption by hydrologists. WaterML includes a series of elements that reflect common notions used in describing hydrologic observations, such as site, variable, source, observation series, seriesCatalog, and data values. Each of the three main request methods in the water web services - GetSiteInfo, GetVariableInfo, and GetValues - has a corresponding response element in WaterML: SitesResponse, VariableResponse, and TimeSeriesResponse. The WaterML specification is being adopted by federal agencies. The experimental USGS NWIS Daily Values web service returns a WaterML-compliant TimeSeriesResponse. The National Climatic Data Center is also prototyping WaterML for data delivery, and has developed a REST-based service that generates WaterML-compliant output for the NCDC ASOS network. Such agency-supported web services coming online provide a much more efficient way to deliver agency data compared to the web site scraper services that the CUAHSI HIS project developed initially. The CUAHSI water data web services will continue to serve as the main communication mechanism within CUAHSI HIS, connecting a variety of data sources with a growing set of web service clients being developed in both academia and the commercial sector. The driving forces for the development of the web services continue to be: - application experience and the needs of the growing number of CUAHSI HIS users, who experiment with additional data types, analysis modes, and data browsing and searching strategies, and provide feedback to WaterML developers; - data description requirements posed by various federal and state agencies; - harmonization with standards being adopted or developed in neighboring communities, in particular the relevant standards being explored within the Open Geospatial Consortium. CUAHSI WaterML is the standard output schema for CUAHSI HIS water web services.
Its formal specification is available as an OGC discussion paper at www.opengeospatial.org/standards/dp/.
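To illustrate the response structure described above, the following Python sketch parses a hand-simplified TimeSeriesResponse of the kind GetValues returns. The XML is an illustration of the elements named in the abstract (site, variable, values), not an exact instance of the official WaterML schema.

```python
# Parse a simplified, illustrative WaterML-style TimeSeriesResponse.
import xml.etree.ElementTree as ET

waterml = """
<TimeSeriesResponse>
  <timeSeries>
    <sourceInfo><siteName>Example Creek</siteName></sourceInfo>
    <variable><variableName>Discharge</variableName><units>cfs</units></variable>
    <values>
      <value dateTime="2008-01-01T00:00:00">12.4</value>
      <value dateTime="2008-01-02T00:00:00">11.9</value>
    </values>
  </timeSeries>
</TimeSeriesResponse>
"""

root = ET.fromstring(waterml)
site = root.findtext(".//siteName")
variable = root.findtext(".//variableName")
# Collect (timestamp, value) pairs from the series.
series = [(v.get("dateTime"), float(v.text)) for v in root.iter("value")]
print(site, variable, series)
```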
Operational Marine Data Acquisition and Delivery Powered by Web and Geospatial Standards
NASA Astrophysics Data System (ADS)
Thomas, R.; Buck, J. J. H.
2015-12-01
As novel sensor types and new platforms are deployed to monitor the global oceans, the volumes of scientific and environmental data collected in the marine context are rapidly growing. In order to use these data both in traditional operational modes and in innovative "Big Data" applications, the data must be readily understood by software agents. One approach to achieving this is the application of both World Wide Web and Open Geospatial Consortium standards: namely Linked Data [1] and Sensor Web Enablement [2] (SWE). The British Oceanographic Data Centre (BODC) is adopting this strategy in a number of European Commission funded projects (NETMAR; SenseOCEAN; Ocean Data Interoperability Platform - ODIP; and AtlantOS) to combine its existing data archiving architecture with SWE components (such as Sensor Observation Services) and a Linked Data interface. These will evolve data management and data transfer from a process that requires significant manual intervention to an automated operational process enabling the rapid, standards-based ingestion and delivery of data. This poster will show the current capabilities of BODC and the status of ongoing implementation of this strategy. References: 1. World Wide Web Consortium. (2013). Linked Data. Available: http://www.w3.org/standards/semanticweb/data. Last accessed 7th April 2015. 2. Open Geospatial Consortium. (2014). Sensor Web Enablement (SWE). Available: http://www.opengeospatial.org/ogc/markets-technologies/swe. Last accessed 8th October 2014.
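A minimal sketch of the Linked Data access pattern mentioned in the abstract: one URI can serve human-readable HTML to a browser and machine-readable RDF to a software agent via HTTP content negotiation. The URI below is a hypothetical placeholder.

```python
# Content negotiation against a (hypothetical) Linked Data identifier.
import requests

uri = "https://linked.example.org/sensor/CTD-001"   # hypothetical identifier

# A software agent asks for a machine-readable serialization...
rdf = requests.get(uri, headers={"Accept": "text/turtle"}, timeout=10)

# ...while a browser asking for text/html would receive a human-readable page.
print(rdf.status_code, rdf.headers.get("Content-Type"))
```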
Research of marine sensor web based on SOA and EDA
NASA Astrophysics Data System (ADS)
Jiang, Yongguo; Dou, Jinfeng; Guo, Zhongwen; Hu, Keyong
2015-04-01
A great deal of ocean sensor observation data exists across a wide range of marine disciplines, derived from in situ and remote observing platforms, in real-time, near-real-time and delayed mode. Ocean monitoring is routinely carried out using sensors and instruments. Standardization is the key requirement for exchanging information about ocean sensors and sensor data and for comparing and combining information from different sensor networks. One or more sensors are often physically integrated into a single ocean 'instrument' device, which brings many challenges related to diverse sensor data formats, parameter units, differing spatiotemporal resolution, application domains, data quality and sensor protocols. Facing these challenges requires standardization efforts aimed at facilitating the so-called Sensor Web, which makes it easy to provide public access to sensor data and metadata. In this paper, a Marine Sensor Web based on SOA and EDA, integrating MBARI's PUCK protocol, IEEE 1451 and OGC SWE 2.0, is illustrated with a five-layer architecture. The Web Service layer and Event Process layer are described in detail with a working example. The demonstration study shows that a standards-based system can be built to access sensors and marine instruments distributed globally using common Web browsers for monitoring the environment and oceanic conditions. Besides serving marine sensor data on the Web, this Marine Sensor Web framework can also play an important role in information integration in many other domains.
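For readers unfamiliar with OGC SWE, the following sketch shows a Sensor Observation Service GetObservation request using the SOS 2.0 key-value-pair binding; the endpoint and identifiers are hypothetical placeholders.

```python
# An OGC SOS 2.0 GetObservation request via the standard KVP binding.
# Endpoint, offering, and observed property are hypothetical placeholders.
import requests

endpoint = "https://sos.example.org/service"        # hypothetical SOS endpoint
params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "sea-water-temperature",            # hypothetical offering
    "observedProperty": "urn:example:temperature",  # hypothetical property
}
resp = requests.get(endpoint, params=params, timeout=30)
print(resp.status_code, len(resp.content), "bytes of O&M XML")
```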
Automatic Generation of Data Types for Classification of Deep Web Sources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ngu, A H; Buttler, D J; Critchlow, T J
2005-02-14
A Service Class Description (SCD) is an effective metadata-based approach for discovering Deep Web sources whose data exhibit some regular patterns. However, it is tedious and error-prone to create an SCD description manually. Moreover, a manually created SCD is not adaptive to the frequent changes of Web sources. It requires its creator to identify all the possible input and output types of a service a priori. In many domains, it is impossible to exhaustively list all the possible input and output data types of a source in advance. In this paper, we describe machine learning approaches for automatic generation of the data types of an SCD. We propose two different approaches for learning data types of a class of Web sources. The Brute-Force Learner is able to generate data types that achieve high recall, but with low precision. The Clustering-based Learner generates data types that have a high precision rate, but a lower recall rate. We demonstrate the feasibility of these two learning-based solutions for automatic generation of data types for citation Web sources and present a quantitative evaluation of the two solutions.
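The general idea of learning data types from sampled values can be illustrated with a toy recognizer. The sketch below is not the paper's Brute-Force or Clustering-based Learner; it simply votes regex-based type patterns over a sample of values.

```python
# Toy illustration: infer a field's data type from sampled values by
# majority vote over regex recognizers. Patterns are illustrative only.
import re
from collections import Counter

PATTERNS = {
    "year":   re.compile(r"^(19|20)\d{2}$"),
    "price":  re.compile(r"^\$?\d+(,\d{3})*(\.\d{2})?$"),
    "author": re.compile(r"^[A-Z][a-z]+,\s*[A-Z]\.$"),
}

def infer_type(samples: list[str]) -> str:
    votes = Counter()
    for value in samples:
        for name, pat in PATTERNS.items():
            if pat.match(value.strip()):
                votes[name] += 1
    # Majority vote over the sampled values; unknown if nothing matched.
    return votes.most_common(1)[0][0] if votes else "unknown"

print(infer_type(["$18,500", "$22,000.00", "$9,999"]))   # -> price
print(infer_type(["1998", "2003", "2005"]))              # -> year
```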
Quality of Web-Based Information on Cannabis Addiction
ERIC Educational Resources Information Center
Khazaal, Yasser; Chatton, Anne; Cochand, Sophie; Zullino, Daniele
2008-01-01
This study evaluated the quality of Web-based information on cannabis use and addiction and investigated particular content quality indicators. Three keywords ("cannabis addiction," "cannabis dependence," and "cannabis abuse") were entered into two popular World Wide Web search engines. Websites were assessed with a standardized proforma designed…
XML Content Finally Arrives on the Web!
ERIC Educational Resources Information Center
Funke, Susan
1998-01-01
Explains extensible markup language (XML) and how it differs from hypertext markup language (HTML) and standard generalized markup language (SGML). Highlights include features of XML, including better formatting of documents, better searching capabilities, multiple uses for hyperlinking, and an increase in Web applications; Web browsers; and what…
Using Context to Assist in Personal File Retrieval
2006-08-25
...of this work, filled in many of the gaps in my knowledge, and helped steer me toward solutions. Anind Dey was also invaluable in helping me design... like a personal assistant. Unfortunately, we are far from this ideal today. In fact, information management is one of the largest problems in... The World Wide Web is, perhaps, the largest distributed naming system in existence. To help manage this namespace, the web combines a...
Integrating the Web and continuous media through distributed objects
NASA Astrophysics Data System (ADS)
Labajo, Saul P.; Garcia, Narciso N.
1998-09-01
The Web has rapidly grown to become the standard for document interchange on the Internet. At the same time, interest in transmitting continuous media flows on the Internet, and in associated applications like multimedia on demand, is also growing. Integrating both kinds of systems should allow building true hypermedia systems where any media object can be linked from any other, taking into account temporal and spatial synchronization. A way to achieve this integration is to use the CORBA architecture, a standard for open distributed systems, and there are also recent efforts to integrate Web and CORBA systems. We use this architecture to build a service for the distribution of data flows with timing restrictions. To integrate it with the Web we use, on one side, Java applets that can use the CORBA architecture and are embedded in HTML pages; on the other side, we also benefit from the efforts to integrate CORBA and the Web.
NASA Astrophysics Data System (ADS)
Kadow, Christopher; Illing, Sebastian; Schartner, Thomas; Ulbrich, Uwe; Cubasch, Ulrich
2017-04-01
Operationalization processes are important for weather and climate services. Complex data and work flows need to be combined quickly to fulfill the needs of service centers. Standards in data and software formats help enable automatic solutions. In this study we show a software solution spanning hindcasts, forecasts, and validation that can be operationalized. Freva (see below) structures data and evaluation procedures and can easily be monitored. Especially in the development process of operationalized services, Freva supports scientists and project partners. The showcase of the decadal climate prediction project MiKlip (fona-miklip.de) illustrates such a complex development process. Different predictions, scientists' input, tasks, and time-evolving adjustments need to be combined to provide precise climate information in a web environment without losing track of its evolution. The Freie Universität Evaluation System Framework (Freva, freva.met.fu-berlin.de) is a software infrastructure for standardized data and tool solutions in Earth system science. Freva runs on high performance computers to handle customizable evaluation systems of research projects, institutes or universities. It combines different software technologies into one common hybrid infrastructure, including all features present in the shell and web environment. The database interface satisfies the international standards provided by the Earth System Grid Federation (ESGF). Freva indexes different data projects into one common search environment by storing the metadata information of the self-describing model, reanalysis and observational data sets in a database. This metadata system, with its advanced but easy-to-handle search tool, supports users, developers and their plugins in retrieving the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools with the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. Facilitating the provision and usage of tools and climate data automatically increases the number of scientists working with the data sets and identifying discrepancies. The integrated webshell (shellinabox) adds a degree of freedom in the choice of the working environment and can be used as a gate to the research project's HPC. Plugins are able to integrate, e.g., their post-processed results into the database of the user. This allows, e.g., post-processing plugins to feed statistical analysis plugins, which fosters an active exchange between plugin developers of a research project. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a database. Configurations and results of the tools can be shared among scientists via the shell or web system. Therefore, plugged-in tools benefit from transparency and reproducibility. Furthermore, if configurations match when starting an evaluation plugin, the system suggests using results already produced by other users, saving CPU hours, I/O, disk space and time. The efficient interaction between different technologies improves the Earth system modeling science framed by Freva.
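The generic plugin API described above suggests a registry pattern of the following kind. This Python sketch is a hypothetical illustration of such a pattern, including the recording of each run for reproducibility; it is not Freva's actual interface.

```python
# A hypothetical plugin-registry sketch; NOT Freva's actual API.
PLUGINS = {}
HISTORY = []   # every run is recorded, cf. the history sub-system above

def register(name):
    """Decorator that registers an analysis tool under a plugin name."""
    def wrap(func):
        PLUGINS[name] = func
        return func
    return wrap

@register("anomaly")
def anomaly(data, baseline):
    # Toy analysis: deviation of each value from a reference baseline.
    return [round(x - baseline, 3) for x in data]

def run_plugin(name, **config):
    """Look up a plugin, run it, and record the configuration so the
    analysis stays transparent and reproducible."""
    result = PLUGINS[name](**config)
    HISTORY.append({"plugin": name, "config": config})
    return result

print(run_plugin("anomaly", data=[14.2, 15.1, 13.8], baseline=14.0))
```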
Bernal-Rusiel, Jorge L.; Rannou, Nicolas; Gollub, Randy L.; Pieper, Steve; Murphy, Shawn; Robertson, Richard; Grant, Patricia E.; Pienaar, Rudolph
2017-01-01
In this paper we present a web-based software solution to the problem of implementing real-time collaborative neuroimage visualization. In both clinical and research settings, simple and powerful access to imaging technologies across multiple devices is becoming increasingly useful. Prior technical solutions have used a server-side rendering and push-to-client model wherein only the server has the full image dataset. We propose a rich client solution in which each client has all the data and uses the Google Drive Realtime API for state synchronization. We have developed a small set of reusable client-side object-oriented JavaScript modules that make use of the XTK toolkit, a popular open-source JavaScript library also developed by our team, for the in-browser rendering and visualization of brain image volumes. Efficient realtime communication among the remote instances is achieved by using just a small JSON object, comprising a representation of the XTK image renderers' state, as the Google Drive Realtime collaborative data model. The developed open-source JavaScript modules have already been instantiated in a web-app called MedView, a distributed collaborative neuroimage visualization application that is delivered to the users over the web without requiring the installation of any extra software or browser plugin. This responsive application allows multiple physically distant physicians or researchers to cooperate in real time to reach a diagnosis or scientific conclusion. It also serves as a proof of concept for the capabilities of the presented technological solution. PMID:28507515
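The core synchronization idea, a small JSON object representing the renderers' state shared through the collaborative data model, can be sketched as follows. The field names are hypothetical, not XTK's actual state schema.

```python
# Serialize a renderer's view state as a small JSON object that a realtime
# channel can broadcast to collaborators. Field names are hypothetical.
import json

renderer_state = {
    "volume": "subject01_T1.nii",     # hypothetical dataset name
    "camera": {"azimuth": 45.0, "elevation": 10.0, "zoom": 1.2},
    "window_level": {"window": 80, "level": 40},
    "slice_index": 97,
}

payload = json.dumps(renderer_state)   # sent over the shared data model
restored = json.loads(payload)         # applied by each remote client
assert restored["slice_index"] == 97
```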
78 FR 74162 - Draft Criminal Justice Offender Tracking System Standard and Companion Documents
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-10
... to the following Web site: https://www.justnet.org/standards/Offender_Tracking_Standards.html. DATES: ... .org/standards/Offender_Tracking_Standards.html. Gregory K. Ridgeway, Acting Director, National...
DOE Office of Scientific and Technical Information (OSTI.GOV)
2012-08-21
NREL's Developer Network, developer.nrel.gov, provides data that users can access and feed into their own analyses and mobile and web applications. Developers retrieve the data through a Web services API (application programming interface). The Developer Network handles the overhead of serving web services, such as key management, authentication, analytics, reporting, documentation standards, and throttling, in a common architecture, while allowing web services and APIs to be maintained and managed independently.
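A typical call against a key-managed Web services API of this kind looks like the following Python sketch; the endpoint path and parameters are hypothetical placeholders rather than a documented developer.nrel.gov route.

```python
# Sketch of a key-authenticated API call; route and parameters hypothetical.
import requests

resp = requests.get(
    "https://developer.nrel.gov/api/example/v1/data.json",  # hypothetical
    params={"api_key": "DEMO_KEY", "state": "CO"},          # key managed by the platform
    timeout=10,
)
resp.raise_for_status()
print(resp.json())
```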
QNAP 1263U Network Attached Storage (NAS)/ Storage Area Network (SAN) Device Users Guide
2016-11-01
standard Ethernet network. Operating either a NAS or SAN is vital for the integrity of the data stored on the drives found in the device. Redundant... speed of the network itself. Many standards are in place for transferring data, including more standard ones such as File Transfer Protocol and Server... The following are the procedures for connecting to the NAS administrative web page: 1) Open a web browser and browse to 192.168.40.8:8080. 2) Enter the
Townley, Mark A; Pu, Qinglin; Zercher, Charles K; Neefus, Christopher D; Tillinghast, Edward K
2012-10-01
In northeastern North America, Zygiella atrica often build their orb webs near the ocean. We analyzed individual field-built Z. atrica webs to determine if organic low-molecular-mass solutes (LMM) in their sticky droplets showed any unusual features not previously seen in orb webs of other species living in less salty environments. While two of the three most abundant organic LMM (putrescine (butane-1,4-diamine) and GABamide (4-aminobutanamide)) are already well-known from webs of inland spiders, the third major LMM, β-alaninamide (3-aminopropanamide), a homolog of GABamide, has not been detected in sticky droplets from any other araneoid spiders (27 species). It remains to be established, however, whether or not use of β-alaninamide is related to proximity to saltwater. We observed variability in organic LMM composition in Z. atrica webs that appeared to be influenced more by an undetermined factor associated with different collecting locations and/or collection dates than by different genders or instars. Shifts in composition when adult females were transferred from the field to the laboratory were also observed. Structural similarities and inverse correlations among β-alaninamide, GABamide, and N-acetylputrescine suggest that they may form a series of LMM fulfilling essentially the same, as yet unknown, role in the webs of those species in which they occur. Copyright © 2012 Verlag Helvetica Chimica Acta AG, Zürich.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-19
... OFFICE OF PERSONNEL MANAGEMENT Submission for Review: Federal Cyber Service: Scholarship for Service (SFS) Registration Web Site AGENCY: U.S. Office of Personnel Management. ACTION: 60-Day Notice and request for comments. SUMMARY: The Human Resources Solutions, Office of Personnel Management (OPM) offers...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-17
... OFFICE OF PERSONNEL MANAGEMENT Submission for Review: Federal Cyber Service: Scholarship for Service (SFS) Registration Web Site AGENCY: Office of Personnel Management. ACTION: 30-Day Notice and request for comments. SUMMARY: The Office of Personnel Management (OPM), Human Resources Solutions...
Rational Analyses of Information Foraging on the Web
ERIC Educational Resources Information Center
Pirolli, Peter
2005-01-01
This article describes rational analyses and cognitive models of Web users developed within information foraging theory. This is done by following the rational analysis methodology of (a) characterizing the problems posed by the environment, (b) developing rational analyses of behavioral solutions to those problems, and (c) developing cognitive…
NASA Astrophysics Data System (ADS)
Postpischl, L.; Morelli, A.; Danecek, P.
2009-04-01
Formats used to represent (and distribute) tomographic earth models differ considerably and are rarely self-consistent. In fact, each earth scientist, or research group, uses specific conventions to encode the various parameterizations used to describe, e.g., seismic wave speed or density in three dimensions, and complete information is often found only in related documents or publications (if available at all). As a consequence, using tomographic models from different authors requires considerable effort, is more cumbersome than it should be, and prevents widespread exchange and circulation within the community. We propose a format, based on modern web standards, able to represent different (grid-based) model parameterizations within the same simple text-based environment, easy to write, to parse, and to visualise. The aim is the creation of self-describing data structures, both human and machine readable, that are automatically recognised by general-purpose software agents and easily imported into scientific programming environments. We think that the adoption of such a representation as a standard for the exchange and distribution of earth models can greatly ease their usage and enhance their circulation, both among fellow seismologists and among a broader non-specialist community. The proposed solution uses semantic web technologies, fully fitting the current trends in data accessibility. It is based on JSON (JavaScript Object Notation), a plain-text, human-readable, lightweight computer data interchange format, which adopts a hierarchical name-value model for representing simple data structures and associative arrays (called objects). Our implementation allows integration of large datasets with metadata (authors, affiliations, bibliographic references, units of measure, etc.) into a single resource. It is equally suited to representing other geo-referenced volumetric quantities beyond tomographic models, as well as (structured and unstructured) computational meshes. This approach can exploit the capabilities of the web browser as a computing platform: a series of in-page quick tools for comparative analysis between models will be presented, as well as visualisation techniques for tomographic layers in Google Maps and Google Earth. We are working on tools for conversion into common scientific formats like netCDF, to allow easy visualisation in GEON-IDV or GMT.
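The kind of self-describing, grid-based JSON structure proposed here might look like the following sketch; the field names are illustrative, not the authors' published schema.

```python
# Illustrative self-describing JSON structure for a grid-based earth model.
import json

model = {
    "metadata": {
        "authors": ["A. Example"],
        "reference": "Example et al. (2009)",   # illustrative metadata
        "quantity": "P-wave speed perturbation",
        "units": "percent",
    },
    "grid": {
        "depths_km": [100, 200],
        "latitudes": [40.0, 41.0],
        "longitudes": [10.0, 11.0],
    },
    # One value per (depth, lat, lon) node of the grid above.
    "values": [[[0.4, 0.1], [-0.2, 0.3]], [[0.0, -0.1], [0.2, 0.5]]],
}

text = json.dumps(model, indent=2)   # human-readable plain text
restored = json.loads(text)          # trivially machine-parsable
assert restored["grid"]["depths_km"][1] == 200
```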
Zare-Farashbandi, Firoozeh; Ramezan-Shirazi, Mahtab; Ashrafi-Rizi, Hasan; Nouri, Rasool
2014-01-01
Introduction: Recent progress in providing innovative solutions for the organization of electronic resources, and research in this area, shows a global trend in the use of new strategies such as metadata to facilitate the description, organization and retrieval of resources in the web environment. In this context, library metadata standards have a special place; therefore, the purpose of the present study was a comparative study of the central libraries' websites of Iranian state universities regarding Hyper Text Mark-up Language (HTML) and Dublin Core metadata element usage in 2011. Materials and Methods: The method of this study is applied-descriptive, and the data collection tool is a set of checklists created by the researchers. The statistical community includes 98 websites of the Iranian state universities of the Ministry of Health and Medical Education and the Ministry of Science, Research and Technology, and the sampling method is the census. Information was collected through observation and direct visits to the websites, and data analysis was performed with Microsoft Excel 2011. Results: The results of this study indicate that none of the websites use Dublin Core (DC) metadata and that only a few of them used overlapping elements between HTML meta tags and DC elements. The percentage of overlapping DC elements in the Ministry of Health was 56% for both description and keywords, and in the Ministry of Science, 45% for keywords and 39% for description. HTML meta tags, however, had a moderate presence in both ministries: the most-used elements were keywords and description (56%) and the least-used were date and formatter (0%). Conclusion: It was observed that the Ministry of Health and the Ministry of Science follow the same path toward using the Dublin Core standard on their websites in the future. Because central library websites are an example of scientific web pages, special attention in designing them can help researchers reach information resources faster and more accurately. Therefore, the influence of librarians' ideas on the awareness of web designers and developers will be important for using metadata elements in general, and specifically for applying such standards. PMID:24741646
Agility: Agent - Ility Architecture
2002-10-01
existing and emerging standards (e.g., distributed objects, email, web, search engines, XML, Java, Jini). Three agent system components resulted from... agents and other Internet resources and operate over the web (AgentGram), a yellow pages service that uses Internet search engines to locate XML ads for agents and other Internet resources (WebTrader).
Going, Going, Still There: Using the WebCite Service to Permanently Archive Cited Web Pages
Trudel, Mathieu
2005-01-01
Scholars are increasingly citing electronic "web references" which are not preserved in libraries or full text archives. WebCite is a new standard for citing web references. To "webcite" a document involves archiving the cited Web page through www.webcitation.org and citing the WebCite permalink instead of (or in addition to) the unstable live Web page. This journal has amended its "instructions for authors" accordingly, asking authors to archive cited Web pages before submitting a manuscript. Almost 200 other journals are already using the system. We discuss the rationale for WebCite, its technology, and how scholars, editors, and publishers can benefit from the service. Citing scholars initiate an archiving process of all cited Web references, ideally before they submit a manuscript. Authors of online documents and websites which are expected to be cited by others can ensure that their work is permanently available by creating an archived copy using WebCite and providing the citation information, including the WebCite link, on their Web document(s). Editors should ask their authors to cache all cited Web addresses (Uniform Resource Locators, or URLs) "prospectively" before submitting their manuscripts to their journal. Editors and publishers should also instruct their copyeditors to cache cited Web material if the author has not done so already. In addition, WebCite can process publisher-submitted "citing articles" (submitted for example as eXtensible Markup Language [XML] documents) to automatically archive all cited Web pages shortly before or on publication. Finally, WebCite can act as a focussed crawler, caching retrospectively the references of already published articles. Copyright issues are addressed by honouring respective Internet standards (robot exclusion files, no-cache and no-archive tags). Long-term preservation is ensured by agreements with libraries and digital preservation organizations. The resulting WebCite Index may also have applications for research assessment exercises, being able to measure the impact of Web services and published Web documents through access and Web citation metrics. PMID:16403724
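The "honouring Internet standards" point can be illustrated with a short sketch: before caching a page, an archiving crawler consults the site's robots exclusion file, and after fetching it checks for a noarchive meta tag. The user-agent name below is a hypothetical placeholder.

```python
# Honour robot exclusion files and noarchive tags before caching a page.
import urllib.robotparser
from urllib.parse import urlsplit

def may_archive(url: str, user_agent: str = "ExampleArchiver") -> bool:
    """Check the site's robots.txt before caching a page."""
    parts = urlsplit(url)
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(f"{parts.scheme}://{parts.netloc}/robots.txt")
    rp.read()                       # fetch and parse robots.txt
    return rp.can_fetch(user_agent, url)

def meta_allows_archiving(html: str) -> bool:
    # Pages can also opt out via <meta name="robots" content="noarchive">;
    # a simple substring test is a deliberate simplification here.
    return "noarchive" not in html.lower()
```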
Automatic Hidden-Web Table Interpretation by Sibling Page Comparison
NASA Astrophysics Data System (ADS)
Tao, Cui; Embley, David W.
The longstanding problem of automatic table interpretation still eludes us. Its solution would not only aid table processing applications such as large-volume table conversion, but would also aid in solving related problems such as information extraction and semi-structured data management. In this paper, we offer a conceptual modeling solution for the common special case in which so-called sibling pages are available. The sibling pages we consider are pages on the hidden web, commonly generated from underlying databases. We compare them to identify and connect nonvarying components (category labels) and varying components (data values). We tested our solution using more than 2,000 tables in source pages from three different domains: car advertisements, molecular biology, and geopolitical information. Experimental results show that the system can successfully identify sibling tables, generate structure patterns, interpret tables using the generated patterns, and automatically adjust the structure patterns, if necessary, as it processes a sequence of hidden-web pages. For these activities, the system was able to achieve an overall F-measure of 94.5%.
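The core comparison step can be illustrated with a toy sketch: cells that stay constant across sibling tables are treated as category labels, and cells that vary as data values. Real pages would require HTML table extraction first; the flat cell lists below are illustrative.

```python
# Toy sibling-page comparison: constant cells -> labels, varying -> values.
sibling_a = ["Make", "Honda", "Model", "Civic", "Price", "$8,500"]
sibling_b = ["Make", "Toyota", "Model", "Camry", "Price", "$9,200"]

labels = [a for a, b in zip(sibling_a, sibling_b) if a == b]
values = [(a, b) for a, b in zip(sibling_a, sibling_b) if a != b]

print("category labels:", labels)   # ['Make', 'Model', 'Price']
print("data values:", values)       # [('Honda', 'Toyota'), ('Civic', 'Camry'), ...]
```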
Web servicing the biological office.
Szugat, Martin; Güttler, Daniel; Fundel, Katrin; Sohler, Florian; Zimmer, Ralf
2005-09-01
Biologists routinely use Microsoft Office applications for standard analysis tasks. Despite ubiquitous internet resources, information needed for everyday work is often not directly and seamlessly available. Here we describe a very simple and easily extendable mechanism using Web Services to enrich standard MS Office applications with internet resources. We demonstrate its capabilities by providing a Web-based thesaurus for biological objects, which maps names to database identifiers and vice versa via an appropriate synonym list. The client application ProTag makes these features available in MS Office applications using Smart Tags and Add-Ins. http://services.bio.ifi.lmu.de/prothesaurus/
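A client-side call to a thesaurus service of this kind might look like the sketch below. The endpoint URL and response shape are hypothetical placeholders, not ProTag's documented interface.

```python
# Hypothetical name-to-identifier lookup against a biological thesaurus
# service; the route and JSON shape are illustrative assumptions.
import requests

def name_to_ids(name: str) -> list[str]:
    resp = requests.get(
        "https://thesaurus.example.org/lookup",   # hypothetical endpoint
        params={"name": name},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["identifiers"]             # hypothetical response field

# e.g. name_to_ids("p53") might return database identifiers for the protein.
```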
Creating Patient and Family Education Web Sites
YADRICH, DONNA MACAN; FITZGERALD, SHARON A.; WERKOWITCH, MARILYN; SMITH, CAROL E.
2013-01-01
This article gives details about the methods and processes used to ensure that usability and accessibility were achieved during development of the Home Parenteral Nutrition Family Caregivers Web site, an evidence-based health education Web site for the family members and caregivers of chronically ill patients. This article addresses comprehensive definitions of usability and accessibility and illustrates Web site development according to Section 508 standards and the national Health and Human Services’ Research-Based Web Design and Usability Guidelines requirements. PMID:22024970
Assessment of disease named entity recognition on a corpus of annotated sentences.
Jimeno, Antonio; Jimenez-Ruiz, Ernesto; Lee, Vivian; Gaudan, Sylvain; Berlanga, Rafael; Rebholz-Schuhmann, Dietrich
2008-04-11
In recent years, the recognition of semantic types from the biomedical scientific literature has focused on named entities like protein and gene names (PGNs) and gene ontology terms (GO terms). Other semantic types, like diseases, have not received the same level of attention. Different solutions have been proposed to identify disease named entities in the scientific literature. While matching the terminology with language patterns suffers from low recall (e.g., Whatizit), other solutions make use of morpho-syntactic features to better cover the full scope of terminological variability (e.g., MetaMap). Currently, MetaMap, provided by the National Library of Medicine (NLM), is the state-of-the-art solution for the annotation of concepts from UMLS (Unified Medical Language System) in the literature. Nonetheless, its performance has not yet been assessed on an annotated corpus. In addition, little effort has been invested so far in generating an annotated dataset that links disease entities in text to disease entries in a database, thesaurus or ontology and that could serve as a gold standard to benchmark text mining solutions. As part of our research work, we have taken a corpus that was previously delivered for the identification of gene-disease associations based on the UMLS Metathesaurus, and we have reprocessed and re-annotated it. We gathered annotations for disease entities from two curators, analyzed their disagreement (0.51 in the kappa statistic) and composed a single annotated corpus for public use. Thereafter, three solutions for disease named entity recognition, including MetaMap, were applied to the corpus to automatically annotate it with UMLS Metathesaurus concepts. The resulting annotations were benchmarked to compare their performance. The annotated corpus is publicly available at ftp://ftp.ebi.ac.uk/pub/software/textmining/corpora/diseases and can serve as a benchmark for other systems. In addition, we found that dictionary look-up already provides competitive results, indicating that the use of disease terminology is highly standardized throughout the terminologies and the literature. MetaMap generates precise results at the expense of insufficient recall, while our statistical method obtains better recall at a lower precision rate. Even better results in terms of precision are achieved by combining at least two of the three methods, but this approach again lowers recall. Altogether, our analysis gives a better understanding of the complexity of disease annotations in the literature. MetaMap and the dictionary-based approach are available through the Whatizit web service infrastructure (Rebholz-Schuhmann D, Arregui M, Gaudan S, Kirsch H, Jimeno A: Text processing through Web services: Calling Whatizit. Bioinformatics 2008, 24:296-298).
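The dictionary look-up baseline that the study found surprisingly competitive amounts to exact matching of terminology entries against text. A minimal sketch follows, with a toy two-entry dictionary; the identifiers are merely UMLS-style illustrations.

```python
# Minimal dictionary look-up NER: exact matching of disease terms in text.
import re

DISEASE_DICT = {
    "alzheimer disease": "C0002395",    # illustrative UMLS-style CUIs
    "diabetes mellitus": "C0011849",
}

def annotate(sentence: str):
    hits = []
    lowered = sentence.lower()
    for term, cui in DISEASE_DICT.items():
        for m in re.finditer(re.escape(term), lowered):
            hits.append((m.start(), m.end(), term, cui))
    return hits

print(annotate("Mutations in APP are associated with Alzheimer disease."))
```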
Savel, Craig; Mierzwa, Stan; Gorbach, Pamina M; Souidi, Samir; Lally, Michelle; Zimet, Gregory; Interventions, Aids
2016-01-01
This paper reports on a specific Web-based self-report data collection system that was developed for a public health research study in the United States. Our focus is on technical outcome results and lessons learned that may be useful to other projects requiring such a solution. The system was accessible from any device that had a browser that supported HTML5. Report findings include: which hardware devices, Web browsers, and operating systems were used; the rate of survey completion; and key considerations for employing Web-based surveys in a clinical trial setting.
NASA Astrophysics Data System (ADS)
Pesquer, Lluís; Jirka, Simon; van de Giesen, Nick; Masó, Joan; Stasch, Christoph; Van Nooyen, Ronald; Prat, Ester; Pons, Xavier
2015-04-01
This work describes the strategy of the European Horizon 2020 project WaterInnEU. Its vision is to enhance the exploitation of EU-funded ICT models, tools, protocols and policy briefs related to the water sector and to establish suitable conditions for new market opportunities based on these offerings. The main goals are: • Connect the research results and developments of previous EU-funded activities with the data already available at European level, and also with the companies that are able to offer products and services based on these tools and data. • Offer an independent marketplace platform, complemented by technical and commercial expertise as a service, that allows users to access the products and services best fitting their priorities, capabilities and procurement processes. One of the pillars of WaterInnEU is to stimulate and prioritize the application of international standards in ICT tools and policy briefs. The standardization of formats, services and processes will allow for harmonized water management across different sectors, fragmented areas and scales (local, regional or international). Several levels of interoperability will be addressed: • Syntactic: Connecting systems and tools together. Syntactic interoperability allows client and service tools to automatically discover, access, and process data and information (query and exchange parts of a database) and to connect to each other in process chains. The discovery of water-related data is achieved using metadata cataloguing standards and, in particular, the one adopted by the INSPIRE directive: the OGC Catalogue Service for the Web (CSW). • Semantic: Sharing a pan-European conceptual framework. This is the ability of computer systems to exchange data with unambiguous, shared meaning. The project therefore addresses not only the packaging of data (syntax), but also the simultaneous transmission of the meaning with the data (semantics). This is accomplished by linking each data element to a controlled, shared vocabulary. In Europe, INSPIRE defines a shared vocabulary and its associated links to an ontology; for hydrographical information this can be used as a baseline. • Organizational: Harmonizing policy aspects. This level of interoperability deals with the operational methodologies and procedures that organizations use to administer their own data and processing capabilities and to share those capabilities with others. This layer is addressed by the adoption of common policy briefs that provide both robust protocols and the flexibility to interact with others. • Data visualization: Making data easy to see. The WMS and WMTS standards are the most commonly used geographic information visualization standards for sharing information in web portals. Our solution will incorporate a quality extension of these standards for visualizing data quality as nested layers linked to the different data sets. In the presented approach, the use of standards is twofold: the tools and products should leverage standards wherever possible to ensure interoperability between solution providers, and the platform itself must utilize standards as much as possible, to allow for example the integration with other systems through open APIs or the description of available items.
Accountable Information Flow for Java-Based Web Applications
2010-01-01
Figure 2 (the Swift architecture) shows the client runtime library, Swift server runtime, Java servlet framework, HTTP Web server, and Web browser. On the server, the Java application code links against Swift's server-side run-time library, which in turn sits on top of the standard Java servlet... (AFRL-RI-RS-TR-2010-9, Final Technical Report, January 2010: Accountable Information Flow for Java-Based Web Applications)
A semantically rich and standardised approach enhancing discovery of sensor data and metadata
NASA Astrophysics Data System (ADS)
Kokkinaki, Alexandra; Buck, Justin; Darroch, Louise
2016-04-01
The marine environment plays an essential role in the earth's climate. To enhance the ability to monitor the health of this important system, innovative sensors are being produced and combined with state-of-the-art sensor technology. As the number of sensors deployed continually increases, it is a challenge for data users to find the data that meet their specific needs. Furthermore, users need to integrate diverse ocean datasets originating from the same or even different systems. Standards provide a solution to the above-mentioned challenges. The Open Geospatial Consortium (OGC) has created the Sensor Web Enablement (SWE) standards, which enable different sensor networks to establish syntactic interoperability. When combined with widely accepted controlled vocabularies, they become semantically rich and semantic interoperability is achievable. In addition, Linked Data is the recommended best practice for exposing, sharing and connecting information on the Semantic Web using Uniform Resource Identifiers (URIs), the Resource Description Framework (RDF) and the RDF query language SPARQL. As part of the EU-funded SenseOCEAN project, the British Oceanographic Data Centre (BODC) is working on the standardisation of sensor metadata enabling 'plug and play' sensor integration. Our approach combines standards, controlled vocabularies and persistent URIs to publish sensor descriptions, their data and associated metadata as 5-star Linked Data and in the OGC SWE standards (SensorML, Observations & Measurements). Thus sensors become readily discoverable, accessible and usable via the web. Content- and context-based searching is also enabled, since sensor descriptions are understood by machines. Additionally, sensor data can be combined with other sensor or Linked Data datasets to form knowledge. This presentation will describe the work done at BODC to achieve syntactic and semantic interoperability in the sensor domain. It will illustrate the reuse and extension of the Semantic Sensor Network (SSN) ontology to the Linked Sensor Ontology (LSO) and the steps taken to combine OGC SWE with the Linked Data approach through alignment and embodiment of other ontologies. It will then explain how data and models were annotated with controlled vocabularies to establish unambiguous semantics and interconnect them with data from different sources. Finally, it will introduce the RDF triple store where the sensor descriptions and metadata are stored and can be queried through the standard query language SPARQL. Providing different flavours of machine-readable interpretations of sensors, sensor data and metadata enhances discoverability but, most importantly, allows seamless aggregation of information from different networks that will finally produce knowledge.
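To illustrate querying sensor descriptions through SPARQL as described above, the following Python sketch loads a tiny Turtle graph into rdflib and queries it. The instance data are illustrative placeholders, not BODC's actual records.

```python
# Query illustrative sensor descriptions in an in-memory RDF graph.
from rdflib import Graph

turtle = """
@prefix ssn: <http://purl.oclc.org/NET/ssnx/ssn#> .
@prefix ex:  <http://example.org/sensors#> .

ex:CTD-001 a ssn:Sensor ;
    ssn:observes ex:SeaWaterTemperature .
"""

g = Graph()
g.parse(data=turtle, format="turtle")

q = """
PREFIX ssn: <http://purl.oclc.org/NET/ssnx/ssn#>
SELECT ?sensor ?property
WHERE { ?sensor a ssn:Sensor ; ssn:observes ?property . }
"""
for row in g.query(q):
    print(row.sensor, row.property)
```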
An Open Source Tool to Test Interoperability
NASA Astrophysics Data System (ADS)
Bermudez, L. E.
2012-12-01
Scientists interact with information at various levels, from gathering raw observed data to accessing portrayed, processed, quality-controlled data. Geoinformatics tools help scientists with the acquisition, storage, processing, dissemination and presentation of geospatial information. Most of these interactions occur in a distributed environment between software components that take the role of either client or server. The communication between components includes protocols, encodings of messages and the managing of errors. Testing of these communication components is important to guarantee proper implementation of standards. The communication between clients and servers can be ad hoc or follow standards. By following standards, interoperability between components increases while the time to develop new software decreases. The Open Geospatial Consortium (OGC) not only coordinates the development of standards but also, within the Compliance Testing Program (CITE), provides a testing infrastructure to test clients and servers. The OGC Web-based Test Engine Facility, based on TEAM Engine, allows developers to test Web services and clients for correct implementation of OGC standards. TEAM Engine is a Java open source facility, available at SourceForge, that can be run via the command line, deployed in a web servlet container or integrated into a developer's environment via Maven. TEAM Engine uses the Compliance Test Language (CTL) and TestNG to test HTTP requests, SOAP services and XML instances against schemas and Schematron-based assertions of any type of web service, not only OGC services. For example, the OGC Web Feature Service (WFS) 1.0.0 test has more than 400 test assertions. Some of these assertions include conformance of HTTP responses, conformance of GML-encoded data, proper values for elements and attributes in the XML, and correct error responses. This presentation will provide an overview of TEAM Engine, an introduction to testing via the OGC testing web site, and a description of how to perform local tests. It will also provide information about how to participate in the open source development of TEAM Engine.
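The flavor of a compliance assertion can be conveyed with a short sketch: fetch a capabilities document over HTTP and assert basic conformance. TEAM Engine itself expresses such tests in CTL and TestNG; this Python version is only an illustration, and the endpoint is a placeholder.

```python
# Illustrative compliance-style assertions against a WFS endpoint.
import xml.etree.ElementTree as ET

import requests

def assert_capabilities(endpoint: str):
    """Fetch a WFS 1.0.0 GetCapabilities response and check basic conformance."""
    resp = requests.get(endpoint, params={
        "service": "WFS", "version": "1.0.0", "request": "GetCapabilities",
    }, timeout=30)
    assert resp.status_code == 200, "HTTP response must be 200"
    root = ET.fromstring(resp.content)           # must be well-formed XML
    assert root.tag.endswith("WFS_Capabilities"), "unexpected root element"

# e.g. assert_capabilities("https://wfs.example.org/ows")  # hypothetical URL
```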
Verhoef, Willem A; Livas, Christos; Delli, Konstantina; Ren, Yijin
2015-05-01
The authors conducted this study to assess the quality of the information available on the Web about oral hygiene for patients with fixed orthodontic appliances. The authors entered the search terms "cleaning braces," "brushing braces," and "oral hygiene and braces" into the Google, Yahoo, and Bing search engines. They analyzed Web sites satisfying the inclusion criteria from the first 20 hits of each search for accessibility, usability, and reliability by using the LIDA instrument; for readability by using the Flesch Reading Ease (FRE) score; and for the completeness of oral hygiene instructions. Sixty-two Web sites met the inclusion criteria. The mean total LIDA score of 71.2 indicated moderate quality of design of the reviewed Web sites. The mean (standard deviation [SD]) LIDA scores for accessibility, usability, and reliability were 85.9 (7.0), 63.4 (16.1), and 48.0 (10.4), respectively. The mean (SD) FRE score of 68.6 (9.7) corresponded to a standard reading level. The completeness of information (mean [SD] = 67.1 [27.8]) showed the highest variability. Overall, the authors found that online oral hygiene materials for orthodontic patients with fixed appliances exhibited modest scores. Readability appeared to be appropriate for young adolescents, whereas the comprehensiveness of the displayed information was highly variable. Further improvement of the infrastructure of electronic health information (that is, e-health) in orthodontics is necessary to meet patients' needs. Given the moderate quality of oral hygiene instruction available on the Web for patients with fixed appliances, orthodontic patients and caregivers should be cautious when browsing the Internet for relevant information. Dental professionals should refer patients to valid Web-based educational materials. Copyright © 2015 American Dental Association. Published by Elsevier Inc. All rights reserved.
BioModels.net Web Services, a free and integrated toolkit for computational modelling software.
Li, Chen; Courtot, Mélanie; Le Novère, Nicolas; Laibe, Camille
2010-05-01
Exchanging and sharing scientific results are essential for researchers in the field of computational modelling. BioModels.net defines agreed-upon standards for model curation. A fundamental one, MIRIAM (Minimum Information Requested in the Annotation of Models), standardises the annotation and curation process of quantitative models in biology. To support this standard, MIRIAM Resources maintains a set of standard data types for annotating models, and provides services for manipulating these annotations. Furthermore, BioModels.net creates controlled vocabularies, such as SBO (Systems Biology Ontology), which strictly indexes, defines and links terms used in Systems Biology. Finally, BioModels Database provides a free, centralised, publicly accessible database for storing, searching and retrieving curated and annotated computational models. Each resource provides a web interface to submit, search, retrieve and display its data. In addition, the BioModels.net team provides a set of Web Services which allows the community to programmatically access the resources. A user is then able to perform remote queries, such as retrieving a model and resolving all its MIRIAM annotations, as well as getting the details about the associated SBO terms. These web services use established standards. Communications rely on SOAP (Simple Object Access Protocol) messages and the available queries are described in a WSDL (Web Services Description Language) file. Several libraries are provided in order to simplify the development of client software. BioModels.net Web Services take researchers one step further toward simulating and understanding the entirety of a biological system, by allowing them to retrieve biological models into their own tools, combine queries in workflows and efficiently analyse models.
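Programmatic access through the SOAP/WSDL pattern described above can be sketched with the zeep SOAP client for Python. The WSDL URL and operation name below are hypothetical placeholders; the service's published WSDL defines the real ones.

```python
# Generic SOAP/WSDL client sketch using zeep; the WSDL URL and operation
# name are hypothetical placeholders, not the service's documented API.
from zeep import Client

client = Client("https://example.org/BioModelsWebServices?wsdl")  # hypothetical

# Operations described in the WSDL become callable methods on client.service.
model_xml = client.service.getModelById("BIOMD0000000001")        # hypothetical op
print(model_xml[:200])
```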
Cognetti, G; Cecere, L
2003-12-01
In 2002, the Italian Ministry of Health promoted the institution of a network and a web portal, E-oncology, for the seven NHS research institutions specialising in oncology (Istituti di Ricovero e Cura a Carattere Scientifico, IRCCS). One of the aims was to gather and provide information on tumoral pathologies to operators and the public. For an optimal organisation of a health web site it is necessary to comply with internationally used standards. The World Wide Web Consortium (W3C) has developed guidelines for the accessibility and usability of sites, implemented in Italy through government measures. Many international organisations adopt rules and codes of conduct to validate biomedical information and have organised quality portals, such as NLM, OMNI, MEDCIRCLE, HON, etc. Terminological standards, such as the MeSH thesaurus and UMLS, have been produced by libraries for correct management and effective information retrieval, and are currently used by the most important biomedical web sites. The Dublin Core, a metadata standard for the integration of information deriving from heterogeneous archives, has also been developed by the library community. Easy access to information conceals the complex architecture necessary for the construction of a web site. The contribution of different professionals is necessary to guarantee the production of quality medical/health web sites; among them, librarians have always been involved in the management of knowledge, and their skills are extremely valuable. Furthermore, the libraries' network is essential in order to guarantee universal access to health information, much of which is still available only against payment, and to contribute to overcoming the 'digital divide' and the 'second-level digital divide'.
Giovanni in the Cloud: Earth Science Data Exploration in Amazon Web Services
NASA Astrophysics Data System (ADS)
Hegde, M.; Petrenko, M.; Smit, C.; Zhang, H.; Pilone, P.; Zasorin, A. A.; Pham, L.
2017-12-01
Giovanni (https://giovanni.gsfc.nasa.gov/giovanni/) is a popular online data exploration tool at the NASA Goddard Earth Sciences Data Information Services Center (GES DISC), providing 22 analysis and visualization services for over 1600 Earth Science data variables. Owing to its popularity, Giovanni has experienced a consistent growth in overall demand, with periodic usage spikes attributed to trainings by education organizations, extensive data analysis in response to natural disasters, preparations for science meetings, etc. Furthermore, the new generation of spaceborne sensors and high resolution models have resulted in an exponential growth in data volume with data distributed across the traditional boundaries of data centers. Seamless exploration of data (without users having to worry about data center boundaries) has been a key recommendation of the GES DISC User Working Group. These factors have required new strategies for delivering acceptable performance. The cloud-based Giovanni, built on Amazon Web Services (AWS), evaluates (1) AWS native solutions to provide a scalable, serverless architecture; (2) open standards for data storage in the Cloud; (3) a cost model for operations; and (4) end-user performance. Our preliminary findings indicate that the use of serverless architecture has the potential to significantly reduce the development and operational cost of Giovanni. The combination of using AWS managed services, storage of data in open standards, and a schema-on-read data access strategy simplifies data access and analytics, in addition to making data more accessible to the end users of Giovanni through popular programming languages.
Giovanni in the Cloud: Earth Science Data Exploration in Amazon Web Services
NASA Technical Reports Server (NTRS)
Petrenko, Maksym; Hegde, Mahabal; Smit, Christine; Zhang, Hailiang; Pilone, Paul; Zasorin, Andrey A.; Pham, Long
2017-01-01
Giovanni is an exploration tool at the NASA Goddard Earth Sciences Data Information Services Center (GES DISC), providing 22 analysis and visualization services for over 1600 Earth Science data variables. Owing to its popularity, Giovanni has experienced a consistent growth in overall demand, with periodic usage spikes attributed to trainings by education organizations, extensive data analysis in response to natural disasters, preparations for science meetings, etc. Furthermore, the new generation of spaceborne sensors and high resolution models have resulted in an exponential growth in data volume with data distributed across the traditional boundaries of data centers. Seamless exploration of data (without users having to worry about data center boundaries) has been a key recommendation of the GES DISC User Working Group. These factors have required new strategies for delivering acceptable performance. The cloud-based Giovanni, built on Amazon Web Services (AWS), evaluates (1) AWS native solutions to provide a scalable, serverless architecture; (2) open standards for data storage in the Cloud; (3) a cost model for operations; and (4) end-user performance. Our preliminary findings indicate that the use of serverless architecture has the potential to significantly reduce the development and operational cost of Giovanni. The combination of using AWS managed services, storage of data in open standards, and a schema-on-read data access strategy simplifies data access and analytics, in addition to making data more accessible to the end users of Giovanni through popular programming languages.
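The schema-on-read strategy mentioned above can be illustrated with a short Python sketch: a self-describing NetCDF file is opened straight from object storage and its structure is discovered at read time rather than defined up front. The bucket, key and variable names are made up for the example.

```python
# A minimal sketch of the "schema-on-read" access pattern: open a
# self-describing file directly from S3 and let the reader infer the schema.
# Bucket, key, and variable names below are invented for illustration.
import s3fs            # pip install s3fs
import xarray as xr    # pip install xarray h5netcdf

fs = s3fs.S3FileSystem(anon=True)  # public, unauthenticated access
with fs.open("s3://example-giovanni-bucket/precip/2017/precip_daily.nc") as f:
    ds = xr.open_dataset(f, engine="h5netcdf")  # schema inferred at read time
    print(ds.data_vars)          # variables discovered from the file itself
    print(ds["precip"].mean())   # simple analysis with no predefined schema
```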
NASA Astrophysics Data System (ADS)
Huang, C. Y.; Wu, C. H.
2016-06-01
The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve people's daily life. However, IoT devices created by different manufacturers follow different proprietary protocols and cannot communicate with each other. This heterogeneity issue causes different products to be locked in multiple closed ecosystems that we call IoT silos. In order to address this issue, a common industrial solution is the hub approach, which implements connectors to communicate with IoT devices following different protocols. However, with the growing number of proprietary protocols proposed by device manufacturers, IoT hubs need to support and maintain a lot of customized connectors. Hence, we believe the ultimate solution to address the heterogeneity issue is to follow open and interoperable standards. Among the existing IoT standards, the Open Geospatial Consortium (OGC) SensorThings API standard supports a comprehensive conceptual model and query functionalities. The first version of SensorThings API mainly focuses on connecting to IoT devices and sharing sensor observations online, which is the sensing capability. Besides the sensing capability, IoT devices could also be controlled via the Internet, which is the tasking capability. While the tasking capability was not included in the first version of the SensorThings API standard, this research aims at defining a tasking capability profile and integrating it with the SensorThings API standard, which we call the extended-SensorThings API in this paper. In general, this research proposes a lightweight JSON-based web service description, the "Tasking Capability Description", allowing device owners and manufacturers to describe different IoT device protocols. Through the extended-SensorThings API, users and applications can follow a coherent protocol to control IoT devices that use different communication protocols, which could consequently achieve an interoperable Internet of Things infrastructure.
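The uniform tasking interface described above can be sketched with a few REST calls. The service root, entity paths and parameter names below are illustrative assumptions in the spirit of the SensorThings tasking extension, not the normative encoding.

```python
# A minimal sketch of tasking an IoT device through a SensorThings-style REST
# endpoint, as in the extended-SensorThings API described above. The base
# URL, entity paths, and parameter names are illustrative assumptions.
import requests

BASE = "http://example.org/SensorThings/v1.0"  # hypothetical service root

# Discover the tasking capabilities that device owners have described.
caps = requests.get(f"{BASE}/TaskingCapabilities").json()
print([c["name"] for c in caps["value"]])

# Submit a task against one capability; the device-specific protocol details
# stay hidden behind the uniform JSON interface.
task = {
    "TaskingCapability": {"@iot.id": 1},
    "taskingParameters": {"switch": "on"},  # assumed parameter name
}
resp = requests.post(f"{BASE}/Tasks", json=task)
print(resp.status_code)
```

The point of the design is visible in the sketch: the client never speaks the manufacturer's protocol, only the coherent JSON interface.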
NASA SensorWeb and OGC Standards for Disaster Management
NASA Technical Reports Server (NTRS)
Mandl, Dan
2010-01-01
I. Goal: Enable users to cost-effectively find and create customized data products to help manage disasters: a) on demand; b) with low-cost, non-specialized tools such as Google Earth and browsers; c) with access via an open network but with sufficient security. II. Use standards to interface various sensors and resultant data: a) wrap sensors in Open Geospatial Consortium (OGC) standards; b) wrap data processing algorithms and servers with OGC standards; c) use standardized workflows to orchestrate and script the creation of these data products. III. Target the Web 2.0 mass market: a) make it simple and easy to use; b) leverage new capabilities and tools that are emerging; c) improve speed and responsiveness.
18 CFR 284.13 - Reporting requirements for interstate pipelines.
Code of Federal Regulations, 2011 CFR
2011-04-01
... following information on its Internet web site, and provide the information in downloadable file formats, in... index of customers must also be posted on the pipeline's Internet web site, in accordance with standards adopted in § 284.12 of this part, and made available from the Internet web site in a downloadable format...
18 CFR 284.13 - Reporting requirements for interstate pipelines.
Code of Federal Regulations, 2014 CFR
2014-04-01
... following information on its Internet web site, and provide the information in downloadable file formats, in... index of customers must also be posted on the pipeline's Internet web site, in accordance with standards adopted in § 284.12 of this part, and made available from the Internet web site in a downloadable format...
18 CFR 284.13 - Reporting requirements for interstate pipelines.
Code of Federal Regulations, 2013 CFR
2013-04-01
... following information on its Internet web site, and provide the information in downloadable file formats, in... index of customers must also be posted on the pipeline's Internet web site, in accordance with standards adopted in § 284.12 of this part, and made available from the Internet web site in a downloadable format...
18 CFR 284.13 - Reporting requirements for interstate pipelines.
Code of Federal Regulations, 2012 CFR
2012-04-01
... following information on its Internet web site, and provide the information in downloadable file formats, in... index of customers must also be posted on the pipeline's Internet web site, in accordance with standards adopted in § 284.12 of this part, and made available from the Internet web site in a downloadable format...
18 CFR 284.13 - Reporting requirements for interstate pipelines.
Code of Federal Regulations, 2010 CFR
2010-04-01
... following information on its Internet web site, and provide the information in downloadable file formats, in... index of customers must also be posted on the pipeline's Internet web site, in accordance with standards adopted in § 284.12 of this part, and made available from the Internet web site in a downloadable format...
sTeam--Providing Primary Media Functions for Web-Based Computer-Supported Cooperative Learning.
ERIC Educational Resources Information Center
Hampel, Thorsten
The World Wide Web has developed into the de facto standard for computer-based learning. However, as a server-centered approach, it confines readers and learners to passive, nonsequential reading. Authoring and Web-publishing systems aim at supporting the authors' design process. Consequently, learners' activities are confined to selecting and…
Teaching Web Evaluation: A Cognitive Development Approach
ERIC Educational Resources Information Center
Benjes-Small, Candice; Archer, Alyssa; Tucker, Katelyn; Vassady, Lisa; Resor, Jennifer
2013-01-01
Web evaluation has been a standard information literacy offering for years and has always been a challenging topic for instruction librarians. Over time, the authors had tried a myriad of strategies to teach freshmen how to assess the credibility of Web sites but felt the efforts were insufficient. By familiarizing themselves with the cognitive…
Knowledge Base for Automatic Generation of Online IMS LD Compliant Course Structures
ERIC Educational Resources Information Center
Pacurar, Ecaterina Giacomini; Trigano, Philippe; Alupoaie, Sorin
2006-01-01
Our article presents a web application based on pedagogical scenarios that allows the automatic generation and development of pedagogical websites. These pedagogical scenarios are represented in the IMS Learning Design standard. Our application is a web portal helping teachers to dynamically generate web course structures and to edit pedagogical content…
40 CFR 63.3360 - What performance tests must I conduct?
Code of Federal Regulations, 2010 CFR
2010-07-01
... decimal point (for example, 0.763). (2) Method 24. For coatings, determine the volatile organic content as... National Emission Standards for Hazardous Air Pollutants: Paper and Other Web Coating General Requirements... control organic HAP on any individual web coating line or any group of web coating lines by: You must: (1...
40 CFR 63.3360 - What performance tests must I conduct?
Code of Federal Regulations, 2011 CFR
2011-07-01
... decimal point (for example, 0.763). (2) Method 24. For coatings, determine the volatile organic content as... National Emission Standards for Hazardous Air Pollutants: Paper and Other Web Coating General Requirements... control organic HAP on any individual web coating line or any group of web coating lines by: You must: (1...
Web Resources for Camp Staff: Where To Look for Answers to Your Questions.
ERIC Educational Resources Information Center
Pavlicin, Karen M.
1997-01-01
The World Wide Web is a good source of quick information, which is especially helpful during the busy camping season. Among the subjects on the Web relevant to camp are horsemanship, canoeing, waterfront safety, government standards, legislative news, disabilities, youth resources, vegetarian meals, grant writing, news, and stress management.…
Overview of the World Wide Web Consortium (W3C) (SIGs IA, USE).
ERIC Educational Resources Information Center
Daly, Janet
2000-01-01
Provides an overview of a planned session to describe the work of the World Wide Web Consortium, including technical specifications for HTML (Hypertext Markup Language), XML (Extensible Markup Language), CSS (Cascading Style Sheets), and over 20 other Web standards that address graphics, multimedia, privacy, metadata, and other technologies. (LRW)
A Web Service and Interface for Remote Electronic Device Characterization
ERIC Educational Resources Information Center
Dutta, S.; Prakash, S.; Estrada, D.; Pop, E.
2011-01-01
A lightweight Web Service and a Web site interface have been developed, which enable remote measurements of electronic devices as a "virtual laboratory" for undergraduate engineering classes. Using standard browsers without additional plugins (such as Internet Explorer, Firefox, or even Safari on an iPhone), remote users can control a Keithley…
Criteria for the Assessment of Foreign Language Instructional Software and Web Sites.
ERIC Educational Resources Information Center
Rifkin, Benjamin
2003-01-01
Presents standards for assessing language-learning software and Web sites in three different contexts: (1) teachers considering whether and how to integrate computer-mediated materials into their instruction; (2) specialists writing reviews of software or Web sites for professional journals; and (3) college administrators evaluating the quality of…
Standardized Testing in Physics via the World Wide Web.
ERIC Educational Resources Information Center
MacIsaac, Dan; Cole, Rebecca Pollard; Cole, David M.; McCullough, Laura; Maxka, Jim
2002-01-01
Examines the differences in paper-based and web-based administrations of a commonly used assessment instrument, the Force Concept Inventory (FCI). Results demonstrated no appreciable difference on FCI scores or FCI items based on the type of administration. Concludes that the web-based administration of the FCI appears to be as efficacious as the…
Operational Use of OGC Web Services at the Met Office
NASA Astrophysics Data System (ADS)
Wright, Bruce
2010-05-01
The Met Office has adopted the Service-Orientated Architecture paradigm to deliver services to a range of customers through Rich Internet Applications (RIAs). The approach uses standard Open Geospatial Consortium (OGC) web services to provide information to web-based applications through a range of generic data services. "Invent", the Met Office beta site, is used to showcase Met Office future plans for presenting web-based weather forecasts, products and information to the public. This currently hosts a freely accessible Weather Map Viewer, written in JavaScript, which accesses a Web Map Service (WMS) to deliver innovative web-based visualizations of weather and its potential impacts to the public. The intention is to engage the public in the development of new web-based services that more accurately meet their needs. As the service is intended for public use within the UK, it has been designed to support a user base of 5 million, the analysed level of UK web traffic reaching the Met Office's public weather information site. The required scalability has been realised through the use of multi-tier tile caching: WMS requests are made for 256x256 tiles for fixed areas and zoom levels; a Tile Cache, developed in house, efficiently serves tiles on demand, managing WMS requests for new tiles; Edge Servers, externally hosted by Akamai, provide a highly scalable (UK-centric) service for pre-cached tiles, passing new requests to the Tile Cache; and the Invent Weather Map Viewer uses the Google Maps API to request tiles from the Edge Servers. (We would expect to make use of the Web Map Tiling Service when it becomes an OGC standard.) The Met Office delivers specialist commercial products to market sectors such as transport, utilities and defence, which exploit a Web Feature Service (WFS) for data relating forecasts and observations to specific geographic features, and a Web Coverage Service (WCS) for sub-selections of gridded data. These are locally rendered as maps or graphs, and combined with the WMS pre-rendered images and text in a FLEX application, to provide a sophisticated, user-impact-based view of the weather. The OGC web services supporting these applications have been developed in collaboration with commercial companies. Visual Weather was originally a desktop application for forecasters, but IBL have developed it to expose the full range of forecast and observation data through standard web services (WCS and WMS). Forecasts and observations relating to specific locations and geographic features are held in an Oracle Database, and exposed as a WFS using Snowflake Software's GO-Publisher application. The Met Office has worked closely with both IBL and Snowflake Software to ensure that the web services provided strike a balance between conformance to the standards and performance in an operational environment. This has proved challenging in areas where the standards are rapidly evolving (e.g. WCS) or do not allow adequate description of the Met-Ocean domain (e.g. multiple time coordinates and parametric vertical coordinates). It has also become clear that careful selection of the features to expose, based on the way in which you expect users to query those features, is necessary in order to deliver adequate performance. These experiences are providing useful 'real-world' input into the recently launched OGC MetOcean Domain Working Group and World Meteorological Organisation (WMO) initiatives in this area.
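The fixed-grid tiling that makes this multi-tier cache effective can be sketched in a few lines of Python: each 256x256 tile at a given zoom level maps to a deterministic bounding box, so the same GetMap URL is produced for the same area every time and upstream caches can hit. The endpoint and layer name are invented for the example; the WMS parameters themselves follow the standard GetMap request.

```python
# A minimal sketch of cache-friendly 256x256 WMS GetMap requests for fixed
# areas and zoom levels, as in the tiling scheme described above. The
# endpoint URL and layer name are made-up examples.
from urllib.parse import urlencode

WEB_MERCATOR_MAX = 20037508.342789244  # half-width of EPSG:3857 in metres

def tile_bbox(z: int, x: int, y: int):
    """Bounding box of slippy-map tile (z, x, y) in EPSG:3857."""
    size = 2 * WEB_MERCATOR_MAX / (2 ** z)          # tile edge length
    minx = -WEB_MERCATOR_MAX + x * size
    maxy = WEB_MERCATOR_MAX - y * size
    return (minx, maxy - size, minx + size, maxy)   # (minx, miny, maxx, maxy)

def wms_tile_url(z: int, x: int, y: int) -> str:
    minx, miny, maxx, maxy = tile_bbox(z, x, y)
    params = {
        "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
        "LAYERS": "surface_temperature",            # hypothetical layer
        "CRS": "EPSG:3857",
        "BBOX": f"{minx},{miny},{maxx},{maxy}",
        "WIDTH": 256, "HEIGHT": 256, "FORMAT": "image/png",
    }
    return "http://example.org/wms?" + urlencode(params)

print(wms_tile_url(6, 31, 20))  # one cacheable tile over north-west Europe
```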
Jiang, Guoqian; Evans, Julie; Endle, Cory M; Solbrig, Harold R; Chute, Christopher G
2016-01-01
The Biomedical Research Integrated Domain Group (BRIDG) model is a formal domain analysis model for protocol-driven biomedical research, and serves as a semantic foundation for application and message development in the standards developing organizations (SDOs). The increasing sophistication and complexity of the BRIDG model requires new approaches to the management and utilization of the underlying semantics to harmonize domain-specific standards. The objective of this study is to develop and evaluate a Semantic Web-based approach that integrates the BRIDG model with ISO 21090 data types to generate domain-specific templates to support clinical study metadata standards development. We developed a template generation and visualization system based on an open source Resource Description Framework (RDF) store backend, a SmartGWT-based web user interface, and a "mind map" based tool for the visualization of generated domain-specific templates. We also developed a RESTful Web Service informed by the Clinical Information Modeling Initiative (CIMI) reference model for access to the generated domain-specific templates. A preliminary usability study was performed, and all reviewers (n = 3) gave very positive responses to the evaluation questions in terms of usability and the capability of meeting the system requirements (average score 4.6). Semantic Web technologies provide a scalable infrastructure and have great potential to enable computable semantic interoperability of models in the intersection of health care and clinical research.
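The RDF-store backend described above can be approximated with a short rdflib sketch: the model is loaded as a graph and a SPARQL query pulls out the classes and attributes that would seed a domain-specific template. The namespace, class and property names are illustrative assumptions, not the actual BRIDG vocabulary.

```python
# A minimal sketch of querying an RDF representation of a domain analysis
# model, as in the RDF-store-backed template generator described above.
# The namespace and property names are made-up illustrations.
from rdflib import Graph

g = Graph()
g.parse("bridg_model.ttl", format="turtle")  # hypothetical local export

# Find every class in a hypothetical "study design" domain together with its
# attributes, i.e. the raw material of a domain-specific template.
query = """
PREFIX ex: <http://example.org/bridg#>
SELECT ?cls ?attr WHERE {
    ?cls a ex:DomainClass ;
         ex:inDomain "StudyDesign" ;
         ex:hasAttribute ?attr .
}
"""
for cls, attr in g.query(query):
    print(cls, attr)
```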
Integrated Functional and Executional Modelling of Software Using Web-Based Databases
NASA Technical Reports Server (NTRS)
Kulkarni, Deepak; Marietta, Roberta
1998-01-01
NASA's software subsystems undergo extensive modification and updates over their operational lifetimes. It is imperative that modified software satisfy safety goals. This report discusses the difficulties encountered in doing so and presents a solution based on integrated modelling of software, the use of automatic information extraction tools, web technology and databases.
Graphical Response Exercises for Teaching Physics
ERIC Educational Resources Information Center
Bonham, Scott
2007-01-01
What is physics without graphs and diagrams? The web is becoming ubiquitous, but how can one expect students to make graphs and diagrams on the web? The solution is to extend functionality through Java applets. Four examples of exercises using the Physics Applets for Drawing (PADs) will illustrate how these can be used for physics instruction to…
Collaborative Learning and Knowledge-Construction through a Knowledge-Based WWW Authoring Tool.
ERIC Educational Resources Information Center
Haugsjaa, Erik
This paper outlines hurdles to using the World Wide Web for learning, specifically in a collaborative knowledge-construction environment. Theoretical solutions based directly on existing Web environments, as well as on research and system prototypes in the areas of Intelligent Tutoring Systems (ITS) and ITS authoring systems, are suggested. Topics…
Jung, Timothy T K; John, Earnest O; Park, Seong Kook; Park, Yong Soo; Rhee, Chong-Ku
2004-02-01
Platelet activating factor (PAF), generated from biologically active phospholipids, has been implicated as a potent inflammatory mediator and has been shown to be involved in many pathological processes, especially in inflammation and allergy. It has been suspected that PAF may be one of the inflammatory mediators in middle ear effusion that can induce sensorineural hearing loss, as observed in chronic otitis media. The PAF receptor antagonist WEB2170 has been studied extensively, and its inhibitory effects against various PAF actions are well proven in otologic systems. The purpose of our study was to determine the effect of superfusion of PAF and WEB2170 on morphological changes in isolated cochlear outer hair cells (OHCs). Isolated OHCs from adult chinchilla cochleas were exposed to albumin-phosphate-buffered saline solution (1 mg/mL), WEB2170 (5 mg/30 mL), PAF (1 micromol/L), or both PAF (1 micromol/L) and WEB2170 (5 mg/30 mL). All experiments were performed at an osmolality of 305 +/- 5 mOsm at room temperature for 30 minutes. The cells were observed with an inverted microscope; the images were stored and analyzed on the Image Pro-Plus program. The OHCs exposed to control albumin-phosphate-buffered saline solution or to WEB2170 did not show any significant change in cell shape or length. The cells exposed to 1 micromol/L of PAF showed ballooning and significant shortening of the mean cell length in 15 to 20 minutes. These morphological changes in OHCs can be prevented by pretreating OHCs with WEB2170. This study demonstrated that exposure to PAF causes morphological changes in isolated OHCs that can be prevented by the PAF receptor antagonist WEB2170.
Evaluating the Quality and Readability of Internet Information on Meningiomas.
Saeed, Fozia; Anderson, Ian
2017-01-01
The Internet is a highly powerful resource for patients and provides an extensive amount of information on medical conditions. It is therefore important that the information accessible is accurate, up to date, and at an appropriate comprehension level for the general public. This article aims to evaluate the quality of patient information on meningiomas. The term meningioma was searched using the following search engines: Google, Bing, Yahoo, Ask, and AOL. The top 100 meningioma Web sites were analyzed for readability using the Flesch Reading Ease score and the Flesch-Kincaid grade level. The quality of each Web page was assessed with the DISCERN instrument and the Centers for Disease Control and Prevention (CDC) Clear Communication Index (CCI). The quality of information on the Internet on meningiomas is highly variable. The overall mean Flesch Reading Ease score was 43.1 (standard deviation = 13.3) and the mean Flesch-Kincaid grade of all the Web sites was 11.2 (standard deviation = 2.3). This finding suggests that the information is on average difficult to read. Only one Web site was at the recommended seventh-grade level and the remainder were above this grade. Only one third of the Web pages had Health On the Net Code of Conduct or The Information Standard certification, and these were found to be of significantly higher quality: DISCERN (P = 0.022) and CDC CCI (P = 0.027). More than 50% of the Web sites had significantly poor or average DISCERN scores and only 2 Web sites fulfilled the CDC CCI criteria. It is recommended that clinicians personally research material for their patients to be able to guide them to reliable and accurate Web sites. Web site authors are also encouraged to obtain Health On the Net Code of Conduct/The Information Standard certification, because this may indicate information of high quality, and to assess the quality of their existing online health information against the CDC CCI criteria.
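The two readability measures used in this study are simple closed-form formulas over word, sentence and syllable counts, so a short sketch makes them concrete. The formulas below are the standard Flesch Reading Ease and Flesch-Kincaid grade level; the syllable counter is a rough vowel-group heuristic rather than a dictionary lookup, so scores will differ slightly from tools that use one.

```python
# A minimal sketch of the Flesch Reading Ease and Flesch-Kincaid grade level
# scores used above. The syllable count is a rough heuristic.
import re

def count_syllables(word: str) -> int:
    # Count groups of consecutive vowels; crude but serviceable for a sketch.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability(text: str):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    w, s = len(words), sentences
    flesch_ease = 206.835 - 1.015 * (w / s) - 84.6 * (syllables / w)
    fk_grade = 0.39 * (w / s) + 11.8 * (syllables / w) - 15.59
    return flesch_ease, fk_grade

ease, grade = readability("A meningioma is a tumour that arises from the "
                          "membranes surrounding the brain and spinal cord.")
print(f"Reading Ease: {ease:.1f}, Grade level: {grade:.1f}")
```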
Digital hand atlas and computer-aided bone age assessment via the Web
NASA Astrophysics Data System (ADS)
Cao, Fei; Huang, H. K.; Pietka, Ewa; Gilsanz, Vicente
1999-07-01
A frequently used assessment method of bone age is atlas matching by a radiological examination of a hand image against a reference set of atlas patterns of normal standards. We are in the process of developing a digital hand atlas with a large standard set of normal hand and wrist images that reflect skeletal maturity, race and sex differences, and current child development. The digital hand atlas will be used for computer-aided bone age assessment via the Web. We have designed and partially implemented a computer-aided diagnostic (CAD) system for Web-based bone age assessment. The system consists of a digital hand atlas, a relational image database and a Web-based user interface. The digital atlas is based on a large standard set of normal hand and wrist images with extracted bone objects and quantitative features. The image database uses content-based indexing to organize the hand images and their attributes and presents them to users in a structured way. The Web-based user interface allows users to interact with the hand image database from browsers. Users can use a Web browser to push a clinical hand image to the CAD server for a bone age assessment. Quantitative features on the examined image, which reflect the skeletal maturity, will be extracted and compared with patterns from the atlas database to assess the bone age. The relevant reference images and the final assessment report will be sent back to the user's browser via the Web. The digital atlas will remove the disadvantages of the currently out-of-date one and allow the bone age assessment to be computerized and done conveniently via the Web. In this paper, we present the system design and Web-based client-server model for computer-assisted bone age assessment and our initial implementation of the digital atlas database.
2011-01-01
Background The complexity and inter-related nature of biological data poses a difficult challenge for data and tool integration. There has been a proliferation of interoperability standards and projects over the past decade, none of which has been widely adopted by the bioinformatics community. Recent attempts have focused on the use of semantics to assist integration, and Semantic Web technologies are being welcomed by this community. Description SADI - Semantic Automated Discovery and Integration - is a lightweight set of fully standards-compliant Semantic Web service design patterns that simplify the publication of services of the type commonly found in bioinformatics and other scientific domains. Using Semantic Web technologies at every level of the Web services "stack", SADI services consume and produce instances of OWL Classes following a small number of very straightforward best-practices. In addition, we provide codebases that support these best-practices, and plug-in tools to popular developer and client software that dramatically simplify deployment of services by providers, and the discovery and utilization of those services by their consumers. Conclusions SADI Services are fully compliant with, and utilize only foundational Web standards; are simple to create and maintain for service providers; and can be discovered and utilized in a very intuitive way by biologist end-users. In addition, the SADI design patterns significantly improve the ability of software to automatically discover appropriate services based on user-needs, and automatically chain these into complex analytical workflows. We show that, when resources are exposed through SADI, data compliant with a given ontological model can be automatically gathered, or generated, from these distributed, non-coordinating resources - a behaviour we have not observed in any other Semantic system. Finally, we show that, using SADI, data dynamically generated from Web services can be explored in a manner very similar to data housed in static triple-stores, thus facilitating the intersection of Web services and Semantic Web technologies. PMID:22024447
Wilkinson, Mark D; Vandervalk, Benjamin; McCarthy, Luke
2011-10-24
The complexity and inter-related nature of biological data poses a difficult challenge for data and tool integration. There has been a proliferation of interoperability standards and projects over the past decade, none of which has been widely adopted by the bioinformatics community. Recent attempts have focused on the use of semantics to assist integration, and Semantic Web technologies are being welcomed by this community. SADI - Semantic Automated Discovery and Integration - is a lightweight set of fully standards-compliant Semantic Web service design patterns that simplify the publication of services of the type commonly found in bioinformatics and other scientific domains. Using Semantic Web technologies at every level of the Web services "stack", SADI services consume and produce instances of OWL Classes following a small number of very straightforward best-practices. In addition, we provide codebases that support these best-practices, and plug-in tools to popular developer and client software that dramatically simplify deployment of services by providers, and the discovery and utilization of those services by their consumers. SADI Services are fully compliant with, and utilize only foundational Web standards; are simple to create and maintain for service providers; and can be discovered and utilized in a very intuitive way by biologist end-users. In addition, the SADI design patterns significantly improve the ability of software to automatically discover appropriate services based on user-needs, and automatically chain these into complex analytical workflows. We show that, when resources are exposed through SADI, data compliant with a given ontological model can be automatically gathered, or generated, from these distributed, non-coordinating resources - a behaviour we have not observed in any other Semantic system. Finally, we show that, using SADI, data dynamically generated from Web services can be explored in a manner very similar to data housed in static triple-stores, thus facilitating the intersection of Web services and Semantic Web technologies.
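The SADI interaction pattern described above is deliberately minimal: a client POSTs RDF describing input instances to the service URL and receives RDF describing output instances attached to the same URIs. The sketch below follows that pattern; the service URL, vocabulary and the use of Turtle (rather than RDF/XML) as the exchange syntax are illustrative assumptions.

```python
# A minimal sketch of the SADI pattern described above: POST RDF input
# instances to the service URL, receive RDF output instances back. The
# service URL and vocabulary are made up for illustration.
import requests
from rdflib import Graph

SERVICE = "http://example.org/sadi/getProteinSequence"  # hypothetical service

input_rdf = """
@prefix ex: <http://example.org/vocab#> .
<http://example.org/protein/P12345> a ex:ProteinRecord .
"""

# Turtle is assumed here; a given service may require RDF/XML instead.
resp = requests.post(SERVICE, data=input_rdf,
                     headers={"Content-Type": "text/turtle",
                              "Accept": "text/turtle"})

out = Graph().parse(data=resp.text, format="turtle")
for s, p, o in out:
    print(s, p, o)  # output properties attached to the same input URI
```

Because the output re-uses the input URIs, a client can chain such calls into the automatically discovered workflows the authors describe.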
Boudreault, David J; Li, Chin-Shang; Wong, Michael S
2016-01-01
To evaluate the effect of web-based education on (1) patient satisfaction, (2) consultation times, and (3) conversion to surgery. A retrospective review of 767 new patient consultations seen by 4 university-based plastic surgeons was conducted between May 2012 and August 2013 to determine the effect a web-based education program has on patient satisfaction and consultation time. A standard 5-point Likert scale survey completed at the end of the consultation was used to assess satisfaction with their experience. Consult times were obtained from the electronic medical record. All analyses were done with Statistical Analysis Software version 9.2 (SAS Inc., Cary, NC). A P value less than 0.05 was considered statistically significant. Those who viewed the program before their consultation were more satisfied with their consultation compared to those who did not (satisfaction scores, mean ± SD: 1.13 ± 0.44 vs 1.36 ± 0.74; P = 0.02) and more likely to rate their experience as excellent (92% vs 75%; P = 0.02). Contrary to the claims of Emmi Solutions, patients who viewed the educational program before consultation trended toward longer visits compared to those who did not (mean time ± SD: 54 ± 26 vs 50 ± 35 minutes; P = 0.10). More patients who completed the program went on to undergo a procedure (44% vs 37%; P = 0.16), but this difference was not statistically significant. Viewing web-based educational programs significantly improved plastic surgery patients' satisfaction with their consultation, but patients who viewed the program also trended toward longer consultation times. Although there was an increase in converting to surgical procedures, this did not reach statistical significance.
A suite of R packages for web-enabled modeling and analysis of surface waters
NASA Astrophysics Data System (ADS)
Read, J. S.; Winslow, L. A.; Nüst, D.; De Cicco, L.; Walker, J. I.
2014-12-01
Researchers often create redundant methods for downloading, manipulating, and analyzing data from online resources. Moreover, the reproducibility of science can be hampered by complicated and voluminous data, lack of time for documentation and long-term maintenance of software, and fear of exposing programming skills. The combination of these factors can encourage unshared one-off programmatic solutions instead of openly provided reusable methods. Federal and academic researchers in the water resources and informatics domains have collaborated to address these issues. The result of this collaboration is a suite of modular R packages that can be used independently or as elements in reproducible analytical workflows. These documented and freely available R packages were designed to fill basic needs for the effective use of water data: the retrieval of time-series and spatial data from web resources (dataRetrieval, geoknife), performing quality assurance and quality control checks of these data with robust statistical methods (sensorQC), the creation of useful data derivatives (including physically- and biologically-relevant indices; GDopp, LakeMetabolizer), and the execution and evaluation of models (glmtools, rLakeAnalyzer). Here, we share details and recommendations for the collaborative coding process, and highlight the benefits of an open-source tool development pattern with a popular programming language in the water resources discipline (such as R). We provide examples of reproducible science driven by large volumes of web-available data using these tools, explore benefits of accessing packages as standardized web processing services (WPS) and present a working platform that allows domain experts to publish scientific algorithms in a service-oriented architecture (WPS4R). We assert that in the era of open data, tools that leverage these data should also be freely shared, transparent, and developed in an open innovation environment.
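The idea of exposing domain algorithms as standardized Web Processing Services, as in the WPS4R platform mentioned above, can be sketched with the OWSLib client library. The endpoint URL and process identifier below are illustrative assumptions.

```python
# A minimal sketch of discovering algorithms published as OGC Web Processing
# Services, in the spirit of the WPS4R pattern described above. The endpoint
# URL and process identifier are made-up examples.
from owslib.wps import WebProcessingService  # pip install OWSLib

# The constructor fetches and parses the GetCapabilities document.
wps = WebProcessingService("http://example.org/wps")

print(wps.identification.title)
for process in wps.processes:          # algorithms published by domain experts
    print(process.identifier, "-", process.title)

# Inspect one hypothetical R-backed process in more detail.
proc = wps.describeprocess("org.example.lakemetabolism")
for inp in proc.dataInputs:
    print("input:", inp.identifier)
```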
2008-03-01
OC4J applications support Java Servlets, Web services, and the following J2EE-specific standards: Extensible Markup Language (XML), Lightweight Directory Access Protocol (LDAP), World Wide Web Distributed Authoring and Versioning (WebDAV), Java Specification Request 168 (JSR 168), and Web Services for Remote Portlets.
Modelling of Tethered Space-Web Structures
NASA Astrophysics Data System (ADS)
McKenzie, D. J.; Cartnell, M. P.
Large structures in space are an essential milestone on the path of many projects, from solar power collectors to space stations. In space, as on Earth, these large projects may be split into more manageable sections, dividing the task into multiple replicable parts. Specially constructed spider robots could assemble these structures piece by piece over a membrane or space-web, giving a method for building a structure while on orbit. The modelling and applications of these space-webs are discussed, along with the derivation of the equations of motion of the structure. The presentation of some preliminary results from the solution of these equations will show that space-webs can take a variety of different forms, and give some guidelines for configuring the space-web system.
tOWL: a temporal Web Ontology Language.
Milea, Viorel; Frasincar, Flavius; Kaymak, Uzay
2012-02-01
Through its interoperability and reasoning capabilities, the Semantic Web opens a realm of possibilities for developing intelligent systems on the Web. The Web Ontology Language (OWL) is the most expressive standard language for modeling ontologies, the cornerstone of the Semantic Web. However, up until now, no standard way of expressing time and time-dependent information in OWL has been provided. In this paper, we present a temporal extension of the very expressive fragment SHIN(D) of the OWL Description Logic language, resulting in the tOWL language. Through a layered approach, we introduce three extensions: 1) concrete domains, which allow the representation of restrictions using concrete domain binary predicates; 2) temporal representation, which introduces time points, relations between time points, intervals, and Allen's 13 interval relations into the language; and 3) timeslices/fluents, which implement a perdurantist view on individuals and allow for the representation of complex temporal aspects, such as process state transitions. We illustrate the expressiveness of the newly introduced language by using an example from the financial domain.
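Allen's 13 interval relations, the temporal core that the second extension builds into the language, are easy to make concrete in code. The sketch below classifies the relation between two intervals given as (start, end) pairs with start < end.

```python
# A minimal sketch of Allen's 13 interval relations, as introduced by the
# temporal representation extension described above.
def allen_relation(a, b):
    (a1, a2), (b1, b2) = a, b
    if a2 < b1:  return "before"
    if b2 < a1:  return "after"
    if a2 == b1: return "meets"
    if b2 == a1: return "met-by"
    if a1 == b1 and a2 == b2: return "equals"
    if a1 == b1: return "starts" if a2 < b2 else "started-by"
    if a2 == b2: return "finishes" if a1 > b1 else "finished-by"
    if b1 < a1 and a2 < b2: return "during"
    if a1 < b1 and b2 < a2: return "contains"
    return "overlaps" if a1 < b1 else "overlapped-by"

# E.g., a process state transition: the "pending" phase meets the "active" one.
print(allen_relation((1, 5), (5, 9)))   # meets
print(allen_relation((2, 6), (4, 10)))  # overlaps
```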
Proposal for a Web Encoding Service (WES) for Spatial Data Transaction
NASA Astrophysics Data System (ADS)
Siew, C. B.; Peters, S.; Rahman, A. A.
2015-10-01
Web services utilization in Spatial Data Infrastructures (SDI) has been well established and standardized by the Open Geospatial Consortium (OGC). Similar web services for 3D SDI have also been established in recent years, with extended capabilities to handle 3D spatial data. The increasing popularity of City Geography Markup Language (CityGML) for 3D city modelling applications leads to the need to handle large spatial datasets for data delivery. This paper revisits the available OGC Web Services (OWS) and proposes the background concepts and requirements for encoding spatial data via a Web Encoding Service (WES). Furthermore, the paper discusses the data flow of the encoder within the web service, e.g. possible integration with the Web Processing Service (WPS) or Web 3D Service (W3DS). The integration could be extended to other available web services for efficient handling of spatial data, especially 3D spatial data.
DDS as middleware of the Southern African Large Telescope control system
NASA Astrophysics Data System (ADS)
Maartens, Deneys S.; Brink, Janus D.
2016-07-01
The Southern African Large Telescope (SALT) software control system is realised as a distributed control system, implemented predominantly in National Instruments' LabVIEW. The telescope control subsystems communicate using cyclic, state-based messages. Currently, transmitting a message is accomplished by performing an HTTP PUT request to a WebDAV directory on a centralised Apache web server, while receiving is based on polling the web server for new messages. While the method works, it presents a number of drawbacks; a scalable distributed communication solution with minimal overhead is a better fit for control systems. This paper describes our exploration of the Data Distribution Service (DDS). DDS is a formal standard specification, defined by the Object Management Group (OMG), that presents a data-centric publish-subscribe model for distributed application communication and integration. It provides an infrastructure for platform-independent many-to-many communication. A number of vendors provide implementations of the DDS standard; RTI, in particular, provides a DDS toolkit for LabVIEW. This toolkit has been evaluated against the needs of SALT, and a few deficiencies have been identified. We have developed our own implementation that interfaces LabVIEW to DDS in order to address our specific needs. Our LabVIEW DDS interface implementation is built against the RTI DDS Core component, provided by RTI under their Open Community Source licence. Our needs dictate that the interface implementation be platform independent. Since we have access to the RTI DDS Core source code, we are able to build the RTI DDS libraries for any of the platforms on which we require support. The communications functionality is based on UDP multicasting. Multicasting is an efficient communications mechanism with low overheads which avoids duplicated point-to-point transmission of data on a network where there are multiple recipients of the data. In the paper we present a performance evaluation of DDS against the current HTTP-based implementation as well as the historical DataSocket implementation. We conclude with a summary and describe future work.
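The UDP multicasting mechanism the paper relies on can be illustrated without any DDS tooling: one sender transmits a state message to a multicast group and every subscribed receiver gets a copy, with no duplicated point-to-point traffic. The sketch below shows only that transport-level mechanism, not DDS itself; the group address, port and message content are made up.

```python
# A minimal sketch of UDP multicast publish/subscribe, the transport
# mechanism described above (not DDS itself). Group, port, and message
# content are illustrative.
import socket
import struct

GROUP, PORT = "239.255.0.1", 5007  # administratively scoped multicast group

def publish(message: bytes):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
    sock.sendto(message, (GROUP, PORT))

def subscribe():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))
    # Join the multicast group on all interfaces.
    mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    while True:
        data, addr = sock.recvfrom(1024)
        print(f"state message from {addr}: {data!r}")

# publish(b"TCS:tracking")  # each subscriber receives one copy off the wire
```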
Graph-Based Semantic Web Service Composition for Healthcare Data Integration.
Arch-Int, Ngamnij; Arch-Int, Somjit; Sonsilphong, Suphachoke; Wanchai, Paweena
2017-01-01
Within the numerous and heterogeneous web services offered through different sources, automatic web services composition is the most convenient method for building complex business processes that permit invocation of multiple existing atomic services. The current solutions in functional web services composition lack autonomous queries of semantic matches within the parameters of web services, which are necessary in the composition of large-scale related services. In this paper, we propose a graph-based Semantic Web Services composition system consisting of two subsystems: management time and run time. The management-time subsystem is responsible for dependency graph preparation in which a dependency graph of related services is generated automatically according to the proposed semantic matchmaking rules. The run-time subsystem is responsible for discovering the potential web services and nonredundant web services composition of a user's query using a graph-based searching algorithm. The proposed approach was applied to healthcare data integration in different health organizations and was evaluated according to two aspects: execution time measurement and correctness measurement.
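The run-time search the paper describes, finding a nonredundant chain of services that produces the requested outputs from the supplied inputs, can be sketched as a breadth-first search over a dependency graph. The service names and input/output concepts below are invented for the example; the authors' actual system derives the graph from semantic matchmaking rules rather than a hand-written table.

```python
# A minimal sketch of graph-based service composition as described above:
# breadth-first search for a shortest, nonredundant chain of invocations.
# Service names and concepts are made-up illustrations.
from collections import deque

# service -> (required input concepts, produced output concepts)
SERVICES = {
    "PatientLookup":   ({"citizen_id"}, {"patient_id"}),
    "LabHistory":      ({"patient_id"}, {"lab_results"}),
    "AllergyRegistry": ({"patient_id"}, {"allergies"}),
}

def compose(have: set, want: set):
    """Breadth-first search for a shortest chain of service invocations."""
    queue = deque([(frozenset(have), [])])
    seen = {frozenset(have)}
    while queue:
        known, plan = queue.popleft()
        if want <= known:
            return plan
        for name, (needs, gives) in SERVICES.items():
            if needs <= known and not gives <= known:  # invocable, not redundant
                nxt = frozenset(known | gives)
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, plan + [name]))
    return None

print(compose({"citizen_id"}, {"lab_results", "allergies"}))
# -> ['PatientLookup', 'LabHistory', 'AllergyRegistry'] (order may vary)
```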
Graph-Based Semantic Web Service Composition for Healthcare Data Integration
2017-01-01
Within the numerous and heterogeneous web services offered through different sources, automatic web services composition is the most convenient method for building complex business processes that permit invocation of multiple existing atomic services. The current solutions in functional web services composition lack autonomous queries of semantic matches within the parameters of web services, which are necessary in the composition of large-scale related services. In this paper, we propose a graph-based Semantic Web Services composition system consisting of two subsystems: management time and run time. The management-time subsystem is responsible for dependency graph preparation in which a dependency graph of related services is generated automatically according to the proposed semantic matchmaking rules. The run-time subsystem is responsible for discovering the potential web services and nonredundant web services composition of a user's query using a graph-based searching algorithm. The proposed approach was applied to healthcare data integration in different health organizations and was evaluated according to two aspects: execution time measurement and correctness measurement. PMID:29065602
1997-07-18
the Marine Corps Exchange. If these sites contain commercial advertisements or sponsorships, the appropriate disclaimer below shall be given... or the information, products or services contained therein. For other than authorized activities such as military exchanges and Morale... posted to the commercial site. 4.9. Design Standards and Non-standard Features (ActiveX and Java) 4.9.1. Design of publicly accessible web
National Institute of Standards and Technology Data Gateway
SRD 60 NIST ITS-90 Thermocouple Database (Web, free access) Web version of Standard Reference Database 60 and NIST Monograph 175. The database gives temperature -- electromotive force (emf) reference functions and tables for the letter-designated thermocouple types B, E, J, K, N, R, S and T. These reference functions have been adopted as standards by the American Society for Testing and Materials (ASTM) and the International Electrotechnical Commission (IEC).
Using web-based video to enhance physical examination skills in medical students.
Orientale, Eugene; Kosowicz, Lynn; Alerte, Anton; Pfeiffer, Carol; Harrington, Karen; Palley, Jane; Brown, Stacey; Sapieha-Yanchak, Teresa
2008-01-01
Physical examination (PE) skills among U.S. medical students have been shown to be deficient. This study examines the effect of a Web-based physical examination curriculum on first-year medical student PE skills. Web-based video clips, consisting of instruction in 77 elements of the physical examination, were created using Microsoft Windows Moviemaker software. Medical students' PE skills were evaluated by standardized patients before and after implementation of the Internet-based video. Following implementation of this curriculum, there was a higher level of competency (from 87% in 2002-2003 to 91% in 2004-2005), and poor performances on standardized patient PE exams diminished substantially (from a 14%-22% failure rate in 2002-2003 to 4% in 2004-2005). A significant improvement in first-year medical student performance on the adult PE occurred after implementing Web-based instructional video.
iSeq: Web-Based RNA-seq Data Analysis and Visualization.
Zhang, Chao; Fan, Caoqi; Gan, Jingbo; Zhu, Ping; Kong, Lei; Li, Cheng
2018-01-01
Transcriptome sequencing (RNA-seq) is becoming a standard experimental methodology for genome-wide characterization and quantification of transcripts at single base-pair resolution. However, downstream analysis of massive amounts of sequencing data can be prohibitively technical for wet-lab researchers. A functionally integrated and user-friendly platform is required to meet this demand. Here, we present iSeq, an R-based Web server, for RNA-seq data analysis and visualization. iSeq is a streamlined Web-based R application under the Shiny framework, featuring a simple user interface and multiple data analysis modules. Users without programming and statistical skills can analyze their RNA-seq data and construct publication-level graphs through a standardized yet customizable analytical pipeline. iSeq is accessible via Web browsers on any operating system at http://iseq.cbi.pku.edu.cn.
BioXSD: the common data-exchange format for everyday bioinformatics web services.
Kalas, Matús; Puntervoll, Pål; Joseph, Alexandre; Bartaseviciūte, Edita; Töpfer, Armin; Venkataraman, Prabakar; Pettifer, Steve; Bryne, Jan Christian; Ison, Jon; Blanchet, Christophe; Rapacki, Kristoffer; Jonassen, Inge
2010-09-15
The world-wide community of life scientists has access to a large number of public bioinformatics databases and tools, which are developed and deployed using diverse technologies and designs. More and more of these resources offer programmatic web-service interfaces. However, efficient use of the resources is hampered by the lack of widely used, standard data-exchange formats for the basic, everyday bioinformatics data types. BioXSD has been developed as a candidate for a standard, canonical exchange format for basic bioinformatics data. BioXSD is represented by a dedicated XML Schema and defines syntax for biological sequences, sequence annotations, alignments and references to resources. We have adapted a set of web services to use BioXSD as the input and output format, and implemented a test-case workflow. This demonstrates that the approach is feasible and provides smooth interoperability. Semantics for BioXSD is provided by annotation with the EDAM ontology. We discuss in a separate section how BioXSD relates to other initiatives and approaches, including existing standards and the Semantic Web. The BioXSD 1.0 XML Schema is freely available at http://www.bioxsd.org/BioXSD-1.0.xsd under the Creative Commons BY-ND 3.0 license. The http://bioxsd.org web page offers documentation, examples of data in BioXSD format, example workflows with source code in common programming languages, an updated list of compatible web services and tools, and a repository of feature requests from the community.
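Because BioXSD is defined by an XML Schema, any message can be validated before it is sent to or returned from a service. The sketch below uses the schema URL given above; the instance document is a made-up illustration and its element names and namespace are assumptions, not necessarily the real BioXSD structure.

```python
# A minimal sketch of validating a document against the published BioXSD
# schema. The schema URL comes from the text above; the instance document
# and its namespace are invented for illustration.
from lxml import etree  # pip install lxml

schema_doc = etree.parse("http://www.bioxsd.org/BioXSD-1.0.xsd")
schema = etree.XMLSchema(schema_doc)

instance = etree.fromstring(
    b"""<sequenceRecord xmlns="http://www.bioxsd.org/BioXSD-1.0">
          <sequence>MKTAYIAKQR</sequence>
        </sequenceRecord>"""
)
print("valid" if schema.validate(instance) else schema.error_log)
```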
Learning about the Human Genome. Part 2: Resources for Science Educators. ERIC Digest.
ERIC Educational Resources Information Center
Haury, David L.
This ERIC Digest identifies how the human genome project fits into the "National Science Education Standards" and lists Human Genome Project Web sites found on the World Wide Web. It is a resource companion to "Learning about the Human Genome. Part 1: Challenge to Science Educators" (Haury 2001). The Web resources and…
The Effects of Implementing Web Accessibility Standards on the Success of Secondary Adolescents
ERIC Educational Resources Information Center
Savi, Christine Opitz; Savenye, Wilhelmina; Rowland, Cynthia
2008-01-01
Web accessibility has become a paramount concern in providing equal access to audiences of all abilities. Unless web accessibility is supported and employed, the internet does not deliver worldwide access as it was intended. This study engaged 60 students in a secondary school setting in order to identify the navigational effectiveness and…
Savel, Craig; Mierzwa, Stan; Gorbach, Pamina M.; Souidi, Samir; Lally, Michelle; Zimet, Gregory; Interventions, AIDS
2016-01-01
This paper reports on a specific Web-based self-report data collection system that was developed for a public health research study in the United States. Our focus is on technical outcome results and lessons learned that may be useful to other projects requiring such a solution. The system was accessible from any device that had a browser that supported HTML5. Report findings include: which hardware devices, Web browsers, and operating systems were used; the rate of survey completion; and key considerations for employing Web-based surveys in a clinical trial setting. PMID:28149445
Exploitation of Semantic Building Model in Indoor Navigation Systems
NASA Astrophysics Data System (ADS)
Anjomshoaa, A.; Shayeganfar, F.; Tjoa, A. Min
2009-04-01
There are many types of indoor and outdoor navigation tools and methodologies available. A majority of these solutions are based on Global Positioning Systems (GPS) and instant video and image processing. These approaches are ideal for open-world environments where very little information about the target location is available, but for large-scale building environments such as hospitals, governmental offices, etc., the end-user needs more detailed information about the surrounding context, which is especially important in the case of people with special needs. This paper presents a smart indoor navigation solution based on Semantic Web technologies and the Building Information Model (BIM). The proposed solution is also aligned with Google Android concepts to support the realization of the results. Keywords: IAI IFCXML, Building Information Model, Indoor Navigation, Semantic Web, Google Android, People with Special Needs. The built environment is a central factor in our daily life, and a large portion of human life is spent inside buildings. Traditionally, buildings are documented using building maps and plans produced with IT tools such as computer-aided design (CAD) applications. Documenting maps electronically is already pervasive, but CAD drawings do not satisfy the requirements for effective building models that can be shared with other building-related applications such as indoor navigation systems. Navigation in the built environment is not a new issue; however, with advances in emerging technologies like GPS, mobile and networked environments, and the Semantic Web, new solutions have been suggested to enrich traditional building maps and convert them into smart information resources that can be reused in other applications and improve their usefulness to building inhabitants and building visitors. Other important issues that should be addressed in building navigation scenarios are location tagging and end-user communication. The available solutions for location tagging are mostly based on proximity sensors, and the information is bound to sensor references. In the solution proposed in this paper, the sensors play a role similar to annotations in the Semantic Web world. Hence the sensor data, in the ontology sense, bridge the gap between sensed information and the building model. Combining the two and applying the proper inference rules, building visitors will be able to reach their destinations with the instant support of their communication devices such as handhelds, wearable computers, mobiles, etc. In a typical scenario of this kind, the user's profile is delivered to the smart building (via the building's ad-hoc services) and the appropriate route for the user is calculated and delivered to the user's end-device. The route is calculated by considering all constraints and requirements of the end user; so, for example, if the user is using a wheelchair, the calculated route should not contain stairs or narrow corridors that the wheelchair cannot pass through. The user then navigates through the building by following the instructions of the end-device, which are in turn generated from the calculated route. During the navigation process, the end-device should also interact with the smart building to sense locations by reading the surrounding tags. So, for example, when a visually impaired person arrives at an unknown space, the tags will be sensed and the relevant information will be delivered to the user in the proper mode of communication.
For example, the building model can be used to generate a voice message for a blind person about a space, telling him/her that "the space has 3 doors, and the door on the left should be chosen, which needs to be pushed to open". In this paper we mainly focus on the automatic generation of semantic building information models (Semantic BIM) and the delivery of results to the end user. Combining the building information model with the environment and user constraints using Semantic Web technologies makes many scenarios conceivable. The generated IFC ontology, which is based on the commonly accepted IFC (Industry Foundation Classes) standard, can be used as the basis of information sharing between buildings, people, and applications. The proposed solution aims to facilitate building navigation in an intuitive and extendable way that is easy to use for end-users and at the same time easy to maintain and manage by building administrators.
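The constraint-aware routing step described above amounts to a shortest-path search over a building graph whose edges carry accessibility attributes, with edges incompatible with the user's profile removed first. The graph, node names and attribute labels below are made-up illustrations of that idea.

```python
# A minimal sketch of profile-constrained routing as described above:
# filter out edges the user cannot traverse, then run Dijkstra. The graph
# and attribute names are invented for illustration.
import heapq

# (from, to): (length_m, attributes)
EDGES = {
    ("entrance", "stairs_a"): (5, {"stairs"}),
    ("stairs_a", "ward"):     (10, {"stairs"}),
    ("entrance", "lift"):     (12, set()),
    ("lift", "ward"):         (8, set()),
}

def route(start, goal, forbidden):
    graph = {}
    for (u, v), (w, attrs) in EDGES.items():
        if not attrs & forbidden:           # drop edges the user cannot use
            graph.setdefault(u, []).append((v, w))
    heap, seen = [(0, start, [start])], set()
    while heap:
        dist, node, path = heapq.heappop(heap)
        if node == goal:
            return dist, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph.get(node, []):
            heapq.heappush(heap, (dist + w, nxt, path + [nxt]))
    return None

# A wheelchair user's profile forbids stairs, so the lift route is chosen.
print(route("entrance", "ward", forbidden={"stairs"}))
```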
Building America Top Innovations 2013 Profile – Building America Solution Center
DOE Office of Scientific and Technical Information (OSTI.GOV)
none,
2013-09-01
This Top Innovation profile provides information about the Building America Solution Center created by Pacific Northwest National Laboratory, a web tool connecting users to thousands of pieces of building science information developed by DOE’s Building America research partners.
Bleda, Marta; Tarraga, Joaquin; de Maria, Alejandro; Salavert, Francisco; Garcia-Alonso, Luz; Celma, Matilde; Martin, Ainoha; Dopazo, Joaquin; Medina, Ignacio
2012-07-01
During the past years, advances in high-throughput technologies have produced an unprecedented growth in the number and size of repositories and databases storing relevant biological data. Today there is more biological information than ever but, unfortunately, the current status of many of these repositories is far from optimal. Some of the most common problems are that the information is spread out over many small databases, that standards frequently differ among repositories, and that some databases are no longer supported or contain overly specific and unconnected information. In addition, data size is increasingly becoming an obstacle when accessing or storing biological data. All these issues make it very difficult to extract and integrate information from different sources, to analyze experiments or to access and query this information in a programmatic way. CellBase provides a solution to the growing need for integration by easing access to biological data. CellBase implements a set of RESTful web services that query a centralized database containing the most relevant biological data sources. The database is hosted on our servers and is regularly updated. CellBase documentation can be found at http://docs.bioinfo.cipf.es/projects/cellbase.
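A RESTful integration layer like the one described above is consumed with ordinary HTTP calls. The sketch below shows the general species/category/feature URL style of such services; the host and exact path pattern are illustrative assumptions, not the exact CellBase API.

```python
# A minimal sketch of querying a centralized RESTful service like CellBase.
# The host and URL pattern are illustrative assumptions.
import requests

BASE = "http://example.org/cellbase/webservices/rest/v3"  # hypothetical host

def gene_info(species: str, gene: str) -> dict:
    url = f"{BASE}/{species}/feature/gene/{gene}/info"
    resp = requests.get(url, params={"of": "json"})
    resp.raise_for_status()
    return resp.json()

info = gene_info("hsapiens", "BRCA2")
print(info)  # coordinates, transcripts, xrefs, etc., in one integrated answer
```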
Corredor, Iván; Metola, Eduardo; Bernardos, Ana M; Tarrío, Paula; Casar, José R
2014-04-29
In the last few years, many health monitoring systems have been designed to fulfill the needs of a large range of scenarios. Although many of those systems provide good ad hoc solutions, most of them lack mechanisms that allow them to be easily reused. This paper focuses on describing an open platform, the micro Web of Things Open Platform (µWoTOP), which has been conceived to improve the connectivity and reusability of context data to deliver different kinds of health, wellness and ambient home care services. µWoTOP is based on a resource-oriented architecture which may be embedded in mobile and resource-constrained devices, enabling access to biometric, ambient or activity sensors and actuator resources through uniform interfaces defined in a RESTful fashion. Additionally, µWoTOP manages two communication modes which allow delivering user context information according to different methods, depending on the requirements of the consumer application. It also generates alert messages based on standards related to health care and risk management, such as the Common Alerting Protocol, in order to make its outputs compatible with existing systems.
COTS technologies for telemedicine applications.
Triunfo, Riccardo; Tumbarello, Roberto; Sulis, Alessandro; Zanetti, Gianluigi; Lianas, Luca; Meloni, Vittorio; Frexia, Francesca
2010-01-01
To demonstrate a simple low-cost system for tele-echocardiology, focused on paediatric cardiology applications. The system was realized using open-source software and COTS technologies. It is based on the transmission of two simultaneous video streams, obtained by direct digitization of the output of an ultrasound machine and by a netcam showing the examination that is taking place. These streams are then embedded into a web page so they are accessible, together with basic video controls, via a standard web browser. The system can also record video streams on a server for further use. The system was tested on a small group of neonatal cases with suspected cardiopathies for a preliminary assessment of its features and diagnostic capabilities. Both the clinical and technological results were encouraging and are leading the way for further experimentation. The presented system can transfer clinical images and videos in an efficient way and in real time. It can be used in the same hospital to support internal consultancy requests, in remote areas using Internet connections and for didactic purposes using low cost COTS appliances and simple interfaces for end users. The solution proposed can be extended to control different medical appliances in those remote hospitals.
Modeling Adaptable Business Service for Enterprise Collaboration
NASA Astrophysics Data System (ADS)
Boukadi, Khouloud; Vincent, Lucien; Burlat, Patrick
Nowadays, a Service Oriented Architecture (SOA) seems to be one of the most promising paradigms for leveraging enterprise information systems. SOA creates opportunities for enterprises to provide value-added services tailored for on-demand enterprise collaboration. With the emergence and rapid development of Web services technologies, SOA is receiving increasing attention and has become widespread. In spite of the popularity of SOA, a standardized framework for modeling and implementing business services is still a work in progress. To support these service-oriented solutions, we adopt a model-driven development approach. This paper outlines the Contextual Service Oriented Modeling and Analysis (CSOMA) methodology and presents UML profiles for PIM-level service-oriented architectural modeling, as well as its corresponding meta-models. The proposed PIM (Platform Independent Model) describes the business SOA at a high level of abstraction, regardless of the techniques involved in the application deployment. In addition, all essential service-specific concerns required for delivering quality and context-aware services are covered. Among the advantages of this approach are that it is generic, and thus not closely tied to Web service technology, and that it specifically treats service adaptability during the design stage.
Staccini, Pascal; Dufour, Jean -Charles; Joubert, Michel; Michiels, Jean -François; Fieschi, Marius
2003-01-01
Nowadays, web-based learning services are a key topic in the pedagogical and learning strategies of universities. While the organisational and teaching requirements of the learning environment are being evaluated, technical specifications are emerging, enabling educators to build advanced "units of learning". Changes, however, take a long time, and cost-effective solutions have to be found to involve our institutions in such actions. In this paper, we present a model of the components of a course. We detail the method followed to implement this model in hypermedia modules with a viewer that can be played online or from a CD-ROM. XML technology was used to implement all the data structures, and a client-side architecture was designed to build a course viewer. Standards for describing content (such as Dublin Core and DocBook) have been integrated into the data structures. This tool has been populated with data from a pathology course and supports other medical contents. The choice of the architecture and the usefulness of the programming tools are discussed. The means of migrating towards a server-side application are presented.
STOCK: Structure mapper and online coarse-graining kit for molecular simulations
Bevc, Staš; Junghans, Christoph; Praprotnik, Matej
2015-03-15
We present a web toolkit STructure mapper and Online Coarse-graining Kit for setting up coarse-grained molecular simulations. The kit consists of two tools: structure mapping and Boltzmann inversion tools. The aim of the first tool is to define a molecular mapping from high, e.g. all-atom, to low, i.e. coarse-grained, resolution. Using a graphical user interface it generates input files, which are compatible with standard coarse-graining packages, e.g. VOTCA and DL_CGMAP. Our second tool generates effective potentials for coarse-grained simulations preserving the structural properties, e.g. radial distribution functions, of the underlying higher resolution model. The required distribution functions can be provided by any simulation package. Simulations are performed on a local machine and only the distributions are uploaded to the server. The applicability of the toolkit is validated by mapping atomistic pentane and polyalanine molecules to a coarse-grained representation. Effective potentials are derived for systems of TIP3P (transferable intermolecular potential 3 point) water molecules and salt solution. The presented coarse-graining web toolkit is available at http://stock.cmm.ki.si.
Using JavaScript and the FDSN web service to create an interactive earthquake information system
NASA Astrophysics Data System (ADS)
Fischer, Kasper D.
2015-04-01
The FDSN web service provides a web interface for accessing earthquake metadata (e.g. event or station information) and waveform data over the internet. Requests are sent to a server as URLs and the output is either XML or miniSEED, which makes it hard for humans to read but easy to process with different software. Several data centers already support the FDSN web service, e.g. USGS, IRIS and ORFEUS. The FDSN web service is also part of the SeisComP3 (http://www.seiscomp3.org) software. The Seismological Observatory of the Ruhr-University switched to SeisComP3 as the standard software for the analysis of mining-induced earthquakes at the beginning of 2014. This made it necessary to create a new web-based earthquake information service for the publication of results to the general public. This has been done by processing the output of an FDSN web service query with JavaScript running in a standard browser. The result is an interactive map presenting the observed events, together with further information on events and stations, on a single web page as a table and on a map. In addition, the user can download event information, waveform data and station data in different formats like miniSEED, QuakeML or FDSN StationXML. The developed code and all used libraries are open source and freely available.
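As an illustration of this request-as-URL pattern, the sketch below queries an FDSN event service in Python and parses the pipe-separated text output. The endpoint and query parameters are examples of our choosing (the system described above runs in the browser with JavaScript), and the requests package is assumed:

    import requests  # third-party HTTP client, assumed available

    # Query an FDSN event service for one month of magnitude >= 5 events.
    # format=text returns one pipe-separated line per event after a '#' header.
    params = {
        "starttime": "2014-01-01",
        "endtime": "2014-02-01",
        "minmagnitude": 5.0,
        "format": "text",
    }
    resp = requests.get("https://service.iris.edu/fdsnws/event/1/query",
                        params=params)
    resp.raise_for_status()

    for line in resp.text.splitlines():
        if not line or line.startswith("#"):
            continue  # skip the header line
        fields = line.split("|")
        # Leading fields follow the FDSN text convention: ID, time, lat, lon, depth
        print(fields[0], fields[1], fields[2], fields[3], fields[4])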
Classroom Practice: From Worn-Out to Web-Based--Better Student Portfolios
ERIC Educational Resources Information Center
Diehm, Celleste
2004-01-01
In this article, the author suggests solutions to unleash student creativity. The article focuses on the author's idea for electronic portfolios, Web-based collections of a student's work. To put her idea into practice, the author created an electronic portfolio project that spanned five 90-minute class sessions (about one session every week or…
ERIC Educational Resources Information Center
Perez, Stella
This document describes LeagueTLC: Transformational Learning Connections (http://www.league.org/leaguetlc/index.htm), a Web site created by the League for Innovation in the Community College with funding from the Fund for the Improvement of Postsecondary Education (FIPSE). This Web site serves as a resource for community colleges by disseminating…
ERIC Educational Resources Information Center
Corder, Greg
2005-01-01
Science teachers face challenges that affect the quality of instruction. Tight budgets, limited resources, school schedules, and other obstacles limit students' opportunities to experience science that is visual and interactive. Incorporating web-based Java applets into science instruction offers a practical solution to these challenges. The…
User Identification and Tracking in an Educational Web Environment.
ERIC Educational Resources Information Center
Marzo-Lazaro, J. L.; Verdu-Carbo, T.; Fabregat-Gesa, R.
This paper describes a solution to the user identification and tracking problem within an educational World Wide Web environment. The paper begins with an overview of the Teaching Support System project at the University of Girona (Spain); the main objective of the project is to create an integrated set of tools for teachers to use to create and…
Analysis of Java Client/Server and Web Programming Tools for Development of Educational Systems.
ERIC Educational Resources Information Center
Muldner, Tomasz
This paper provides an analysis of old and new programming tools for development of client/server programs, particularly World Wide Web-based programs. The focus is on development of educational systems that use interactive shared workspaces to provide portable and expandable solutions. The paper begins with a short description of relevant terms.…
ERIC Educational Resources Information Center
Machovec, George S., Ed.
1995-01-01
Explains the Common Gateway Interface (CGI) protocol as a set of rules for passing information from a Web server to an external program such as a database search engine. Topics include advantages over traditional client/server solutions, limitations, sample library applications, and sources of information from the Internet. (LRW)
ERIC Educational Resources Information Center
Miltiadou, Marios; McIsaac, Marina S.
The purpose of this paper is to review problems encountered in World Wide Web-based courses delivered at three different educational institutions (i.e., two community colleges and a university) in the metropolitan Phoenix (Arizona) area. Implications are discussed based on distance education theories of interaction. Interaction is a vital issue to…
Developing a Web 2.0-Based System with User-Authored Content for Community Use and Teacher Education
ERIC Educational Resources Information Center
Cifuentes, Lauren; Sharp, Amy; Bulu, Sanser; Benz, Mike; Stough, Laura M.
2010-01-01
We report on an investigation into the design, development, implementation, and evaluation of an informational and instructional Website in order to generate guidelines for instructional designers of read/write Web environments. We describe the process of design and development research, the problem addressed, the theory-based solution, and the…
Evaluating the Accessibility of Web-Based Instruction for Students with Disabilities.
ERIC Educational Resources Information Center
Hinn, D. Michelle
This paper presents the methods and results of a year-long evaluation study, conducted for the purpose of determining disability accessibility barriers and potential solutions for those barriers found in four World Wide Web-based learning environments. The primary questions used to frame the evaluation study were: (1) Are there any features of the…
ERIC Educational Resources Information Center
Asuman, Baguma; Khan, Md. Shahadat Hossain; Clement, Che Kum
2018-01-01
This article reports on the barriers encountered by teachers and the possible solutions to the integration of web-based learning (WBL) into higher educational institutions in Uganda. A total of 50 teachers in the departments of ICT, management, and social sciences from five different universities were purposively selected. A self-designed…
Bringing Interactivity to the Web: The JAVA Solution.
ERIC Educational Resources Information Center
Knee, Richard H.; Cafolla, Ralph
Java is an object-oriented programming language of the Internet. Its popularity lies in its ability to create interactive Web sites across platforms. The most common Java programs are applications and applets, which adhere to a set of conventions that lets them run within a Java-compatible browser. Java is becoming an essential subject matter and…
NASA Astrophysics Data System (ADS)
Menguy, Theotime
Because of its critical nature, the avionics industry is bound by numerous constraints, such as security standards and certifications, while having to fulfill clients' desires for personalization. In this context, variability management is a very important issue for re-engineering projects of avionics software. In this thesis, we propose a new approach, based on formal concept analysis and the semantic web, to support variability management. The first goal of this research is to identify characteristic behaviors and interactions of configuration variables in a dynamically configured system. To identify such elements, we used formal concept analysis on different levels of abstraction in the system and defined new metrics. We then built a classification of the configuration variables and their relations to enable quick identification of a variable's behavior in the system. This classification could help in finding a systematic approach to processing variables during a re-engineering operation, depending on their category. To gain a better understanding of the system, we also studied code controlled jointly by several configuration variables. A second objective of this research is to build a knowledge platform that gathers the results of all the analyses performed and stores any additional element relevant to the variability management context, for instance new results that help define a re-engineering process for each of the categories. To address this goal, we built a solution based on the semantic web, defining a new, extensive ontology that enables inferences related to the evolution processes. The approach presented here is, to the best of our knowledge, the first classification of the configuration variables of a dynamically configured software system and an original use of documentation and variability management techniques based on the semantic web in the aeronautic field. The analyses performed and the final results show that formal concept analysis is a way to identify specific properties and behaviors, and that the semantic web is a good solution for storing and exploring the results. However, using formal concept analysis with new boolean relations, such as the link between configuration variables and files, and defining new inferences may lead to better conclusions. Applying the same methodology to other systems would make it possible to validate the approach in other contexts.
User interface and patient involvement.
Andreassen, Hege Kristin; Lundvoll Nilsen, Line
2013-01-01
Increased patient involvement is a goal in contemporary health care and is important to the development of patient-oriented ICT. In this paper we discuss how the design of patient-user interfaces can affect patient involvement. Our discussion is based on 12 semi-structured interviews with patient users of a web-based solution for patient-doctor communication piloted in Norway. We argue that ICT solutions offering a choice of user interfaces on the patient side are preferable, to ensure individual accommodation and a high degree of patient involvement. When introducing web-based tools for patient-health professional communication, a free-text option should be provided to the patient users.
Electronic Health Records: An Enhanced Security Paradigm to Preserve Patient's Privacy
NASA Astrophysics Data System (ADS)
Slamanig, Daniel; Stingl, Christian
In recent years, demographic change and increasing treatment costs have demanded the adoption of more cost-efficient, highly qualitative and integrated health care processes. The rapid growth and availability of the Internet facilitate the development of eHealth services, and especially of electronic health records (EHRs), which are promising solutions to meet the aforementioned requirements. Among current web-based EHR systems, patient-centric and patient-moderated approaches are widely deployed. Besides, there is an emerging market of so-called personal health record platforms, e.g. Google Health. Both concepts provide central, web-based access to highly sensitive medical data. Additionally, the fact that these systems may be hosted by not fully trustworthy providers makes it necessary to consider privacy issues thoroughly. In this paper we define security and privacy objectives that play an important role in the context of web-based EHRs. Furthermore, we discuss deployed solutions as well as concepts proposed in the literature with respect to these objectives and point out several weaknesses. Finally, we introduce a system which overcomes the drawbacks of existing solutions by taking a holistic approach to preserving patients' privacy, and we discuss the applied methods.
Reaction time effects in lab- versus Web-based research: Experimental evidence.
Hilbig, Benjamin E
2016-12-01
Although Web-based research is now commonplace, it continues to spur skepticism from reviewers and editors, especially whenever reaction times are of primary interest. Such persistent preconceptions are based on arguments referring to increased variation, the limits of certain software and technologies, and a noteworthy lack of comparisons (between Web and lab) in fully randomized experiments. To provide a critical test, participants were randomly assigned to complete a lexical decision task either (a) in the lab using standard experimental software (E-Prime), (b) in the lab using a browser-based version (written in HTML and JavaScript), or (c) via the Web using the same browser-based version. The classical word frequency effect was typical in size and corresponded to a very large effect in all three conditions. There was no indication that the Web- or browser-based data collection was in any way inferior. In fact, if anything, a larger effect was obtained in the browser-based conditions than in the condition relying on standard experimental software. No differences between Web and lab (within the browser-based conditions) could be observed, thus disconfirming any substantial influence of increased technical or situational variation. In summary, the present experiment contradicts the still common preconception that reaction time effects of only a few hundred milliseconds cannot be detected in Web experiments.
A New User Interface for On-Demand Customizable Data Products for Sensors in a SensorWeb
NASA Technical Reports Server (NTRS)
Mandl, Daniel; Cappelaere, Pat; Frye, Stuart; Sohlberg, Rob; Ly, Vuong; Chien, Steve; Sullivan, Don
2011-01-01
A SensorWeb is a set of sensors, which can consist of ground, airborne and space-based sensors interoperating in an automated or autonomous collaborative manner. The NASA SensorWeb toolbox, developed at NASA/GSFC in collaboration with NASA/JPL, NASA/Ames and other partners, is a set of software and standards that (1) enables users to create virtual private networks of sensors over open networks; (2) provides the capability to orchestrate their actions; (3) provides the capability to customize the output data products; and (4) enables automated delivery of the data products to the users' desktops. A recent addition to the SensorWeb Toolbox is a new user interface, together with web services co-resident with the sensors, to enable rapid creation, loading and execution of new algorithms for processing sensor data. The web service, along with the user interface, follows the Open Geospatial Consortium (OGC) standard called the Web Coverage Processing Service (WCPS). This presentation will detail the prototype that was built and how the WCPS was tested against a HyspIRI flight testbed and an elastic computation cloud on the ground with EO-1 data. HyspIRI is a future NASA decadal mission. The elastic computation cloud stores EO-1 data and runs software similar to Amazon online shopping.
Department of Agriculture, Food Safety and Inspection Service
Medina-Aunon, J. Alberto; Martínez-Bartolomé, Salvador; López-García, Miguel A.; Salazar, Emilio; Navajas, Rosana; Jones, Andrew R.; Paradela, Alberto; Albar, Juan P.
2011-01-01
The development of the HUPO-PSI's (Proteomics Standards Initiative) standard data formats and MIAPE (Minimum Information About a Proteomics Experiment) guidelines should improve proteomics data sharing within the scientific community. Proteomics journals have encouraged the use of these standards and guidelines to improve the quality of experimental reporting and ease the evaluation and publication of manuscripts. However, there is an evident lack of bioinformatics tools specifically designed to create and edit standard file formats and reports, or embed them within proteomics workflows. In this article, we describe a new web-based software suite (The ProteoRed MIAPE web toolkit) that performs several complementary roles related to proteomic data standards. First, it can verify that the reports fulfill the minimum information requirements of the corresponding MIAPE modules, highlighting inconsistencies or missing information. Second, the toolkit can convert several XML-based data standards directly into human readable MIAPE reports stored within the ProteoRed MIAPE repository. Finally, it can also perform the reverse operation, allowing users to export from MIAPE reports into XML files for computational processing, data sharing, or public database submission. The toolkit is thus the first application capable of automatically linking the PSI's MIAPE modules with the corresponding XML data exchange standards, enabling bidirectional conversions. This toolkit is freely available at http://www.proteored.org/MIAPE/. PMID:21983993
Towards the novel reasoning among particles in PSO by the use of RDF and SPARQL.
Fister, Iztok; Yang, Xin-She; Ljubič, Karin; Fister, Dušan; Brest, Janez; Fister, Iztok
2014-01-01
The significant development of the Internet has posed some new challenges, and many new programming tools have been developed to address them. Today, the semantic web is a modern paradigm for representing and accessing knowledge data on the Internet. This paper uses semantic tools such as the Resource Description Framework (RDF) and the SPARQL RDF query language for optimization purposes. These tools are combined with particle swarm optimization (PSO), where the selection of the best solutions depends on their fitness. Instead of the local best solution, a neighborhood of solutions can be defined for each particle and used in the calculation of its new position, based on key ideas from the semantic web domain. Preliminary results obtained by optimizing ten benchmark functions are promising, and this method should be investigated further.
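To make the idea concrete, here is a minimal sketch (our illustration, not the authors' implementation; the namespace, property names and swarm values are invented, and the third-party rdflib package is assumed) that stores particle fitness as RDF triples and selects a neighborhood-best solution with a SPARQL query:

    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import XSD

    PSO = Namespace("http://example.org/pso#")  # hypothetical vocabulary
    g = Graph()

    # Hypothetical swarm: each particle carries a fitness value, and particle
    # p1 is linked to its neighborhood via a pso:neighbor property
    fitness = {"p1": 3.2, "p2": 1.7, "p3": 2.9}
    for name, fit in fitness.items():
        g.add((PSO[name], PSO.fitness, Literal(fit, datatype=XSD.double)))
    g.add((PSO.p1, PSO.neighbor, PSO.p2))
    g.add((PSO.p1, PSO.neighbor, PSO.p3))

    # SPARQL: the best (minimum-fitness) solution in p1's neighborhood
    q = """
    PREFIX pso: <http://example.org/pso#>
    SELECT ?n ?f WHERE {
        pso:p1 pso:neighbor ?n .
        ?n pso:fitness ?f .
    } ORDER BY ?f LIMIT 1
    """
    for row in g.query(q):
        print(row.n, row.f)  # expected: pso:p2 with fitness 1.7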
NASA Astrophysics Data System (ADS)
Yue, Songshan; Chen, Min; Wen, Yongning; Lu, Guonian
2016-04-01
Earth environment is extremely complicated and constantly changing; thus, it is widely accepted that a single geo-analysis model cannot accurately represent all details when solving complex geo-problems. Over several years of research, numerous geo-analysis models have been developed. However, a collaborative barrier between model providers and model users still exists. The development of cloud computing has provided a new and promising approach for sharing and integrating geo-analysis models across an open web environment. To share and integrate these heterogeneous models, encapsulation studies are needed that shield the models' original execution differences and create services that can be reused in the web environment. Although some model service standards (such as the Web Processing Service (WPS) and Geo Processing Workflow (GPW)) have been designed and developed to help researchers construct model services, various problems regarding model encapsulation remain. (1) The descriptions of geo-analysis models are complicated and typically require rich-text descriptions and case-study illustrations, which are difficult to represent fully within a single web request (such as the GetCapabilities and DescribeProcess operations in the WPS standard). (2) Although Web Service technologies can be used to publish model services, model users who want to use a geo-analysis model and copy the model service onto another computer still encounter problems (e.g., they cannot access information about the model's deployment dependencies). This study presents a strategy for encapsulating geo-analysis models that reduces the problems encountered when sharing models between model providers and model users and supports these tasks with different web service standards (e.g., the WPS standard). A description method for heterogeneous geo-analysis models is studied. Based on the model description information, methods for encapsulating the model-execution program into model services and for describing model-service deployment information are also included in the proposed strategy. Hence, the model-description interface, model-execution interface and model-deployment interface are studied to help model providers and model users more easily share, reuse and integrate geo-analysis models in an open web environment. Finally, a prototype system is established, and the WPS standard is employed as an example to verify the capability and practicability of the model-encapsulation strategy. The results show that the strategy makes it more convenient for modellers to share and integrate heterogeneous geo-analysis models on cloud computing platforms.
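As a concrete example of the single-web-request interaction mentioned above, the sketch below issues a standard WPS 1.0.0 GetCapabilities request and lists the advertised process identifiers. The endpoint URL is a hypothetical placeholder, and the third-party requests package is assumed:

    import requests  # third-party HTTP client, assumed available
    import xml.etree.ElementTree as ET

    base = "http://example.org/wps"  # hypothetical WPS endpoint
    params = {"service": "WPS", "version": "1.0.0", "request": "GetCapabilities"}
    resp = requests.get(base, params=params)
    resp.raise_for_status()

    # Each offered process is announced with an ows:Identifier element
    root = ET.fromstring(resp.content)
    OWS = "{http://www.opengis.net/ows/1.1}"
    for ident in root.iter(OWS + "Identifier"):
        print(ident.text)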
Eng, J
1997-01-01
Java is a programming language that runs on a "virtual machine" built into World Wide Web (WWW)-browsing programs on multiple hardware platforms. Web pages were developed with Java to enable Web-browsing programs to overlay transparent graphics and text on displayed images so that the user could control the display of labels and annotations on the images, a key feature not available with standard Web pages. This feature was extended to include the presentation of normal radiologic anatomy. Java programming was also used to make Web browsers compatible with the Digital Imaging and Communications in Medicine (DICOM) file format. By enhancing the functionality of Web pages, Java technology should provide greater incentive for using a Web-based approach in the development of radiology teaching material.
Strategies of Successful Technology Integrators. Part I: Streamlining Classroom Management.
ERIC Educational Resources Information Center
McNally, Lynn; Etchison, Cindy
2000-01-01
Discussion of how to develop curriculum that successfully integrates technology into elementary and secondary school classrooms focuses on solutions for school and classroom management tasks. Highlights include Web-based solutions; student activities; word processing; desktop publishing; draw and paint programs; spreadsheets; and database…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-26
.... until 2 p.m. and a dinner break is scheduled from 5 p.m. until 6:30 p.m. The EPA's Web site for the... preferences on speaking times may not be able to be fulfilled. If you will require the service of a translator... hearing schedules, including lists of speakers, will be posted on the EPA's Web site at http://www.epa.gov...
NASA Astrophysics Data System (ADS)
Criado, Javier; Padilla, Nicolás; Iribarne, Luis; Asensio, Jose-Andrés
Due to the globalization of the information and knowledge society on the Internet, modern Web-based Information Systems (WIS) must be flexible and prepared to be easily accessible and manageable in real time. In recent times, special interest has been paid to the globalization of information through a common vocabulary (i.e., ontologies) and to the standardized way in which information is retrieved on the Web (i.e., powerful search engines and intelligent software agents). These same principles of globalization and standardization should also be valid for the user interfaces of WIS, but these are still built on traditional development paradigms. In this paper we present an approach to reduce the globalization/standardization gap in the generation of WIS user interfaces by using a real-time "bottom-up" composition perspective with COTS-interface components (interface widgets) and trading services.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-26
... electronic form, will be posted on the NRC Web site and on the Federal rulemaking Web site http://www... that they do not want publicly disclosed. Federal Rulemaking Web site: Go to http://www.regulations.gov... CONTACT: Mr. Ian C. Jung, Chief, Instrumentation, Controls and Electrical Engineering Branch 2, Division...
A Research on E - learning Resources Construction Based on Semantic Web
NASA Astrophysics Data System (ADS)
Rui, Liu; Maode, Deng
Traditional e-learning platforms have the flaws that resources are usually difficult to query or locate and that cross-platform sharing and interoperability are hard to realize. In this paper, the semantic web and metadata standards are discussed, and an e-learning system framework based on the semantic web is put forward to try to solve the flaws of traditional e-learning platforms.
Code of Federal Regulations, 2013 CFR
2013-10-01
.... (3) When an Internet Web site of the agent or broker is used to complete the QHP selection, at a minimum the Internet Web site must: (i) Disclose and display all QHP information provided by the Exchange... broker's Internet Web site for a QHP, prominently display a standardized disclaimer provided by HHS...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-04
...-and-r[email protected] , Attention Docket ID No. EPA-HQ-OAR-2003-0119. Facsimile: Fax your comments to... otherwise protected through http://www.regulations.gov or e-mail. The http://www.regulations.gov Web site is... following Web site: http://www.epa.gov/airquality/combustion . Please refer to this Web site to confirm the...
76 FR 63878 - New Source Performance Standards Review for Nitric Acid Plants
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-14
... comments. Agency Web site: http://www.epa.gov/oar/docket.html . Follow the instructions for submitting comments on the EPA Air and Radiation Docket Web site. E-mail: a-and-r[email protected] . Include EPA-HQ-OAR....regulations.gov Web site is an ``anonymous access'' system, which means that the EPA will not know your...
Employing Virtual Humans for Education and Training in X3D/VRML Worlds
ERIC Educational Resources Information Center
Ieronutti, Lucio; Chittaro, Luca
2007-01-01
Web-based education and training provides a new paradigm for imparting knowledge; students can access the learning material anytime by operating remotely from any location. Web3D open standards, such as X3D and VRML, support Web-based delivery of Educational Virtual Environments (EVEs). EVEs have a great potential for learning and training…
TreeVector: scalable, interactive, phylogenetic trees for the web.
Pethica, Ralph; Barker, Gary; Kovacs, Tim; Gough, Julian
2010-01-28
Phylogenetic trees are complex data forms that need to be graphically displayed to be human-readable. Traditional techniques of plotting phylogenetic trees focus on rendering a single static image, but increases in the production of biological data and large-scale analyses demand scalable, browsable, and interactive trees. We introduce TreeVector, a Scalable Vector Graphics- and Java-based method that allows trees to be integrated and viewed seamlessly in standard web browsers with no extra software required, and can be modified and linked using standard web technologies. There are now many bioinformatics servers and databases with a range of dynamic processes and updates to cope with the increasing volume of data. TreeVector is designed as a framework to integrate with these processes and produce user-customized phylogenies automatically. We also address the strengths of phylogenetic trees as part of a linked-in browsing process rather than an end graphic for print. TreeVector is fast and easy to use and is available to download precompiled, but is also open source. It can also be run from the web server listed below or the user's own web server. It has already been deployed on two recognized and widely used database Web sites.
CD-based image archival and management on a hybrid radiology intranet.
Cox, R D; Henri, C J; Bret, P M
1997-08-01
This article describes the design and implementation of a low-cost image archival and management solution on a radiology network consisting of UNIX, IBM personal computer-compatible (IBM, Purchase, NY) and Macintosh (Apple Computer, Cupertino, CA) workstations. The picture archiving and communications system (PACS) is modular and scalable and conforms to the Digital Imaging and Communications in Medicine (DICOM) 3.0 standard for image transfer, storage and retrieval. Image data is made available on soft-copy reporting workstations by a work-flow management scheme and on desktop computers through a World Wide Web (WWW) interface. Data archival is based on recordable compact disc (CD) technology and is automated. The project has allowed the radiology department to eliminate the use of film in magnetic resonance (MR) imaging, computed tomography (CT) and ultrasonography.
The Orthanc Ecosystem for Medical Imaging.
Jodogne, Sébastien
2018-05-03
This paper reviews the components of Orthanc, a free and open-source, highly versatile ecosystem for medical imaging. At the core of the Orthanc ecosystem, the Orthanc server is a lightweight vendor neutral archive that provides PACS managers with a powerful environment to automate and optimize the imaging flows that are very specific to each hospital. The Orthanc server can be extended with plugins that provide solutions for teleradiology, digital pathology, or enterprise-ready databases. It is shown how software developers and research engineers can easily develop external software or Web portals dealing with medical images, with minimal knowledge of the DICOM standard, thanks to the advanced programming interface of the Orthanc server. The paper concludes by introducing the Stone of Orthanc, an innovative toolkit for the cross-platform rendering of medical images.
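To make the "minimal knowledge of the DICOM standard" point concrete, here is a small sketch that lists stored studies through Orthanc's REST API. It assumes a local Orthanc server on its default port 8042 with authentication disabled, and the third-party requests package:

    import requests  # third-party HTTP client, assumed available

    orthanc = "http://localhost:8042"  # default Orthanc port; adjust as needed

    # Orthanc answers /studies with a JSON array of internal study identifiers
    study_ids = requests.get(f"{orthanc}/studies").json()

    for sid in study_ids[:5]:
        study = requests.get(f"{orthanc}/studies/{sid}").json()
        tags = study.get("MainDicomTags", {})
        print(sid, tags.get("StudyDate"), tags.get("StudyDescription"))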
NASA Astrophysics Data System (ADS)
Wagner, Rick; Castanotto, Giuseppe; Goldberg, Kenneth A.
1995-11-01
The Internet offers tremendous potential for rapid development of mechanical products to meet global competition. In the past several years, a number of geometric algorithms have been developed to evaluate manufacturing properties such as feedability, fixturability, assemblability, etc. This class of algorithms is sometimes termed "DFX: Design for X". One problem is that most of these algorithms are tailored to a particular CAD system and format and so have not been widely tested by industry. The World Wide Web may offer a solution: its simple interface language may become a de facto standard for the exchange of geometric data. In this preliminary paper we describe one model for remote analysis of CAD models that we believe holds promise for use in industry (e.g. during the design cycle) and in research (e.g. to encourage verification of results).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carnes, E.T.; Truett, D.F.; Truett, L.F.
In the handful of years since the World Wide Web (WWW or Web) came into being, Web sites have developed at an astonishing rate. With the influx of Web pages comes a disparity of site types, including personal homepages, commercial sales sites, and educational data. The variety of sites and the deluge of information contained on the Web exemplify the individual nature of the WWW. Whereas some people argue that it is this eclecticism which gives the Web its charm, we propose that sites which are repositories of technical data would benefit from standardization. This paper proffers a methodology for publishing ecological research on the Web. The template we describe uses capabilities of HTML (the HyperText Markup Language) to enhance the value of the traditional scientific paper.
ChemDoodle Web Components: HTML5 toolkit for chemical graphics, interfaces, and informatics.
Burger, Melanie C
2015-01-01
ChemDoodle Web Components (abbreviated CWC, iChemLabs, LLC) is a light-weight (~340 KB) JavaScript/HTML5 toolkit for chemical graphics, structure editing, interfaces, and informatics based on the proprietary ChemDoodle desktop software. The library uses
NASA Technical Reports Server (NTRS)
Laakso, J. H.; Smith, D. D.; Zimmerman, D. K.
1973-01-01
The fabrication of two shear web test elements and three large-scale shear web test components is reported. In addition, the fabrication of test fixtures for the elements and components is described. The center-loaded beam test fixtures were configured to have a test side and a dummy or permanent side. The test fixtures were fabricated from standard extruded aluminum sections and plates and were designed to be reusable.
Engineering Analysis Using a Web-based Protocol
NASA Technical Reports Server (NTRS)
Schoeffler, James D.; Claus, Russell W.
2002-01-01
This paper reviews the development of a web-based framework for engineering analysis. A one-dimensional, high-speed analysis code called LAPIN was used in this study, but the approach can be generalized to any engineering analysis tool. The web-based framework enables users to store, retrieve, and execute an engineering analysis from a standard web-browser. We review the encapsulation of the engineering data into the eXtensible Markup Language (XML) and various design considerations in the storage and retrieval of application data.
Beveridge, Allan
2006-01-01
The Internet consists of a vast inhomogeneous reservoir of data. Developing software that can integrate a wide variety of different data sources is a major challenge that must be addressed for the realisation of the full potential of the Internet as a scientific research tool. This article presents a semi-automated object-oriented programming system for integrating web-based resources. We demonstrate that the current Internet standards (HTML, CGI [common gateway interface], Java, etc.) can be exploited to develop a data retrieval system that scans existing web interfaces and then uses a set of rules to generate new Java code that can automatically retrieve data from the Web. The validity of the software has been demonstrated by testing it on several biological databases. We also examine the current limitations of the Internet and discuss the need for the development of universal standards for web-based data.
Developing Interoperable Air Quality Community Portals
NASA Astrophysics Data System (ADS)
Falke, S. R.; Husar, R. B.; Yang, C. P.; Robinson, E. M.; Fialkowski, W. E.
2009-04-01
Web portals are intended to provide consolidated discovery, filtering and aggregation of content from multiple, distributed web sources targeted at particular user communities. This paper presents a standards-based information architectural approach to developing portals aimed at air quality community collaboration in data access and analysis. An important characteristic of the approach is to advance beyond the present stand-alone design of most portals to achieve interoperability with other portals and information sources. We show how using metadata standards, web services, RSS feeds and other Web 2.0 technologies, such as Yahoo! Pipes and del.icio.us, helps increase interoperability among portals. The approach is illustrated within the context of the GEOSS Architecture Implementation Pilot where an air quality community portal is being developed to provide a user interface between the portals and clearinghouse of the GEOSS Common Infrastructure and the air quality community catalog of metadata and data services.
[A solution for display and processing of DICOM images in web PACS].
Xue, Wei-jing; Lu, Wen; Wang, Hai-yang; Meng, Jian
2009-03-01
Java applets are used to support DICOM images in an ordinary Web browser, thereby extending the browser's medical image processing functions. The DICOM file format is first analyzed and a class that acquires the pixel data is designed; two applet classes are then designed, one to process the DICOM image and the other to display the image processed by the first applet. Both are embedded in the view page and communicate through the AppletContext object. The method allows users to display and process DICOM images directly in an ordinary Web browser, giving the Web PACS the advantages of both the B/S (browser/server) and C/S (client/server) models. The Java applet is the key to extending the Web browser's functionality in a Web PACS and provides a guideline for the sharing of medical images.
Information-Flow-Based Access Control for Web Browsers
NASA Astrophysics Data System (ADS)
Yoshihama, Sachiko; Tateishi, Takaaki; Tabuchi, Naoshi; Matsumoto, Tsutomu
The emergence of Web 2.0 technologies such as Ajax and Mashup has revealed the weakness of the same-origin policy [1], the current de facto standard for the Web browser security model. We propose a new browser security model, based on information-flow-based access control (IBAC), that allows fine-grained access control in client-side Web applications for secure mashups and user-generated content, copes with the dynamic nature of client-side Web applications, and accurately determines the privilege of scripts in the event-driven programming model.
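The following sketch illustrates the core information-flow idea in miniature (our illustration, not the authors' system; the flow policy and origins are invented): every value carries the set of origins that influenced it, labels are joined as values are combined, and a send is allowed only if the policy admits the flow:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Labeled:
        value: object
        origins: frozenset  # origins whose data influenced this value

    def lift(value, origin):
        return Labeled(value, frozenset({origin}))

    def combine(f, a, b):
        # Information flows from both inputs, so their labels are joined (union)
        return Labeled(f(a.value, b.value), a.origins | b.origins)

    def can_send(data, destination):
        # Hypothetical policy: data may leave the browser only towards an
        # origin that is its sole influence (no cross-origin leakage)
        return data.origins <= frozenset({destination})

    token = lift("secret-token", "https://bank.example")
    query = lift("?q=", "https://ads.example")
    url = combine(lambda a, b: b + a, token, query)

    print(can_send(query, "https://ads.example"))  # True: pure ad-origin data
    print(can_send(url, "https://ads.example"))    # False: bank data would leak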
Teixeira, Leonor; Saavedra, Vasco; Ferreira, Carlos; Sousa Santos, Beatriz
2010-01-01
Modern methods of information and communication that use web technologies provide an opportunity to facilitate closer communication between patients and healthcare providers, allowing a joint management of chronic diseases. This paper describes a web-based technological solution to support the management of inherited bleeding disorders integrating, diffusing and archiving large sets of data relating to the clinical practice of hemophilia care, more specifically the clinical practice at the Hematology Service of Coimbra Hospital Center (a Hemophilia Treatment Center located in Portugal).
Research on SaaS and Web Service Based Order Tracking
NASA Astrophysics Data System (ADS)
Jiang, Jianhua; Sheng, Buyun; Gong, Lixiong; Yang, Mingzhong
To solve the problem of order tracking across enterprises in a Dynamic Virtual Enterprise (DVE), a SaaS- and web-service-based order tracking solution was designed by analyzing the order management process in a DVE. To realize the system, a SaaS-based architecture for managing data on the manufacturing states of order tasks was constructed, and a method for encapsulating application systems as web services was investigated. The order tracking process in the system is then described. Finally, the feasibility of this study was verified by the development of a prototype system.
Pedersen, Natalia; Thielsen, Peter; Martinsen, Lars; Bennedsen, Mette; Haaber, Anne; Langholz, Ebbe; Végh, Zsuzsanna; Duricova, Dana; Jess, Tine; Bell, Sally; Burisch, Johan; Munkholm, Pia
2014-12-01
To individualize treatment with mesalazine for ulcerative colitis relapses through a self-managed, web-based solution to optimize the short-term disease course. Prospective, open-label, web-guided study with 3 months of mesalazine therapy among patients with mild-to-moderate ulcerative colitis. Once a week, patients completed the simple clinical colitis activity index (SCCAI) and registered fecal calprotectin (FC) on the web application: www.meza.constant-care.dk. SCCAI and FC were summed, resulting in a total inflammatory burden score (TIBS). Deep remission was defined as SCCAI ≤1, FC = 0, and TIBS ≤1. A total of 95 patients (62% females; median age 45 yr) were included in the study and allocated 4.8 g mesalazine per day. Of these, 82 (86%) patients were adherent to web therapy, completing 3 months of web-guided mesalazine therapy. Of the 82 adherent patients, 72 (88%) continued mesalazine and 10 (12%) needed rescue therapy. From weeks 0 to 12, patients experienced a significant reduction in mean SCCAI (4.6 versus 1.6, P < 0.001), mean FC (437 versus 195, P < 0.001), and mean TIBS (6.7 versus 2.4, P < 0.001). Based on TIBS values (≤1), the dose of mesalazine was reduced to 2.4 g in 25% of patients at week 3, in 50% of subjects at week 5, and in 88% of patients at week 12. Web-guided therapy with mesalazine in mild-to-moderate ulcerative colitis helps to individualize the dose and improve adherence to therapy. The study confirms mesalazine efficacy in mild-to-moderate UC, significantly improving TIBS values in the majority of patients.
ERIC Educational Resources Information Center
Burd, Elizabeth L.; Hatch, Andrew; Ashurst, Colin; Jessop, Alan
2009-01-01
This article describes an approach whereby patterns are used to describe management issues and solutions to be used during the project management of team-based software development. The work describes how web 2.0 technologies have been employed to support the use and development of such patterns. To evaluate the success of patterns and the…
A Calculating Web Site Could Ignite a New Campus "Math War"
ERIC Educational Resources Information Center
Young, Jeffrey R.
2009-01-01
The long-running debate over whether students should be allowed to wield calculators during mathematics exams may soon seem quaint. The latest dilemma facing professors is whether to let students turn to a Web site called WolframAlpha, which not only solves complex math problems, but also can spell out the steps leading to those solutions. In…
ERIC Educational Resources Information Center
PKI for Networked Higher Education Working Group.
2000-01-01
One barrier to Web-based education is lack of an effective system to identify and authorize involved participants, content, and institutions. PKI (public-key infrastructure) is an emerging technology that can certify the correct identity of each person and communication in Web-based learning. A June 2000 National Science Foundation workshop…
ERIC Educational Resources Information Center
Estrada, Luis
2012-01-01
The purpose of this study was to identify the obstacles to the adoption of Web 2.0 technologies as part of corporate learning solutions and strategies. The study followed a qualitative inquiry approach. The sample consisted of 20 corporate learning professionals who are members of the American Society for Training and Development (ASTD) social…
Technology Review of Multi-Agent Systems and Tools
2005-06-01
over a network, including the Internet. A web services architecture is the logical evolution of object-oriented analysis and design coupled with...the logical evolution of components geared towards the architecture, design, implementation, and deployment of e-business solutions. As in object...querying. The Web Services architecture describes the principles behind the next generation of e-business architectures, presenting a logical evolution
Spatial Guilds in the Serengeti Food Web Revealed by a Bayesian Group Model
Baskerville, Edward B.; Dobson, Andy P.; Bedford, Trevor; Allesina, Stefano; Anderson, T. Michael; Pascual, Mercedes
2011-01-01
Food webs, networks of feeding relationships in an ecosystem, provide fundamental insights into mechanisms that determine ecosystem stability and persistence. A standard approach in food-web analysis, and network analysis in general, has been to identify compartments, or modules, defined by many links within compartments and few links between them. This approach can identify large habitat boundaries in the network but may fail to identify other important structures. Empirical analyses of food webs have been further limited by low-resolution data for primary producers. In this paper, we present a Bayesian computational method for identifying group structure using a flexible definition that can describe both functional trophic roles and standard compartments. We apply this method to a newly compiled plant-mammal food web from the Serengeti ecosystem that includes high taxonomic resolution at the plant level, allowing a simultaneous examination of the signature of both habitat and trophic roles in network structure. We find that groups at the plant level reflect habitat structure, coupled at higher trophic levels by groups of herbivores, which are in turn coupled by carnivore groups. Thus the group structure of the Serengeti web represents a mixture of trophic guild structure and spatial pattern, in contrast to the standard compartments typically identified. The network topology supports recent ideas on spatial coupling and energy channels in ecosystems that have been proposed as important for persistence. Furthermore, our Bayesian approach provides a powerful, flexible framework for the study of network structure, and we believe it will prove instrumental in a variety of biological contexts. PMID:22219719
BioXSD: the common data-exchange format for everyday bioinformatics web services
Kalaš, Matúš; Puntervoll, Pål; Joseph, Alexandre; Bartaševičiūtė, Edita; Töpfer, Armin; Venkataraman, Prabakar; Pettifer, Steve; Bryne, Jan Christian; Ison, Jon; Blanchet, Christophe; Rapacki, Kristoffer; Jonassen, Inge
2010-01-01
Motivation: The world-wide community of life scientists has access to a large number of public bioinformatics databases and tools, which are developed and deployed using diverse technologies and designs. More and more of the resources offer a programmatic web-service interface. However, efficient use of the resources is hampered by the lack of widely used, standard data-exchange formats for the basic, everyday bioinformatics data types. Results: BioXSD has been developed as a candidate for a standard, canonical exchange format for basic bioinformatics data. BioXSD is represented by a dedicated XML Schema and defines syntax for biological sequences, sequence annotations, alignments and references to resources. We have adapted a set of web services to use BioXSD as the input and output format, and implemented a test-case workflow. This demonstrates that the approach is feasible and provides smooth interoperability. Semantics for BioXSD is provided by annotation with the EDAM ontology. We discuss in a separate section how BioXSD relates to other initiatives and approaches, including existing standards and the Semantic Web. Availability: The BioXSD 1.0 XML Schema is freely available at http://www.bioxsd.org/BioXSD-1.0.xsd under the Creative Commons BY-ND 3.0 license. The http://bioxsd.org web page offers documentation, examples of data in BioXSD format, example workflows with source codes in common programming languages, an updated list of compatible web services and tools and a repository of feature requests from the community. Contact: matus.kalas@bccs.uib.no; developers@bioxsd.org; support@bioxsd.org PMID:20823319
A step-by-step solution for embedding user-controlled cines into educational Web pages.
Cornfeld, Daniel
2008-03-01
The objective of this article is to introduce a simple method for embedding user-controlled cines into a Web page using a simple JavaScript. Step-by-step instructions are included and the source code is made available. This technique allows the creation of portable Web pages that allow the user to scroll through cases as if seated at a PACS workstation. A simple JavaScript allows scrollable image stacks to be included on Web pages. With this technique, you can quickly and easily incorporate entire stacks of CT or MR images into online teaching files. This technique has the potential for use in case presentations, online didactics, teaching archives, and resident testing.
40 CFR 63.3320 - What emission standards must I meet?
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 13 2013-07-01 2012-07-01 true What emission standards must I meet? 63... (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants: Paper and Other Web Coating Emission Standards...
Application of Open Source Technologies for Oceanographic Data Analysis
NASA Astrophysics Data System (ADS)
Huang, T.; Gangl, M.; Quach, N. T.; Wilson, B. D.; Chang, G.; Armstrong, E. M.; Chin, T. M.; Greguska, F.
2015-12-01
NEXUS is a data-intensive analysis solution developed with a new approach for handling science data that enables large-scale data analysis by leveraging open source technologies such as Apache Cassandra, Apache Spark, Apache Solr, and Webification. NEXUS has been selected to provide on-the-fly time-series and histogram generation for the Soil Moisture Active Passive (SMAP) mission for Level 2 and Level 3 Active, Passive, and Active Passive products. It also provides an on-the-fly data subsetting capability. NEXUS is designed to scale horizontally, enabling it to handle massive amounts of data in parallel. It takes a new approach to managing time- and geo-referenced array data by dividing data artifacts into chunks and storing them in an industry-standard, horizontally scaled NoSQL database. This approach enables the development of scalable data analysis services that can infuse and leverage the elastic computing infrastructure of the Cloud. It is equipped with a high-performance geospatial and indexed data search solution, coupled with a high-performance data Webification solution free from file I/O bottlenecks, as well as a high-performance, in-memory data analysis engine. In this talk, we will focus on the recently funded AIST 2014 project that uses NEXUS as the core of an oceanographic anomaly detection service and web portal, which we call OceanXtremes.
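A minimal sketch of the chunking idea described above, with an assumed tile size and key scheme (both hypothetical, not NEXUS's actual layout; numpy is assumed): a geo-referenced array is cut into fixed-size tiles, each stored under a key encoding dataset, time step and tile indices, so a horizontally scaled NoSQL store can serve spatial subsets in parallel:

    import numpy as np  # third-party array library, assumed available

    def tile_grid(data, dataset, time_index, tile_size=64):
        """Yield (key, tile) pairs for one time step of a 2D lat/lon grid."""
        rows, cols = data.shape
        for i in range(0, rows, tile_size):
            for j in range(0, cols, tile_size):
                tile = data[i:i + tile_size, j:j + tile_size]
                # Hypothetical key scheme: dataset:time:tile-row:tile-col
                key = f"{dataset}:{time_index}:{i // tile_size}:{j // tile_size}"
                yield key, tile

    # Example: a 256x512 sea-surface grid becomes 4 x 8 = 32 independent tiles
    grid = np.random.rand(256, 512).astype(np.float32)
    tiles = dict(tile_grid(grid, dataset="sst", time_index=0))
    print(len(tiles))  # 32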
This January 2004 document contains 14 diagrams illustrating the different compliance options available for those facilities that fall under the Paper and Web Coating Maximum Achievable Control Technology (MACT) standard.
40 CFR 63.3290 - Does this subpart apply to me?
Code of Federal Regulations, 2010 CFR
2010-07-01
... Standards for Hazardous Air Pollutants: Paper and Other Web Coating What This Subpart Covers § 63.3290 Does... is a major source of HAP, as defined in § 63.2, at which web coating lines are operated. ...
40 CFR 63.3290 - Does this subpart apply to me?
Code of Federal Regulations, 2014 CFR
2014-07-01
...) National Emission Standards for Hazardous Air Pollutants: Paper and Other Web Coating What This Subpart... existing facility that is a major source of HAP, as defined in § 63.2, at which web coating lines are...
40 CFR 63.3290 - Does this subpart apply to me?
Code of Federal Regulations, 2011 CFR
2011-07-01
... Standards for Hazardous Air Pollutants: Paper and Other Web Coating What This Subpart Covers § 63.3290 Does... is a major source of HAP, as defined in § 63.2, at which web coating lines are operated. ...
Easy access to geophysical data sets at the IRIS Data Management Center
NASA Astrophysics Data System (ADS)
Trabant, C.; Ahern, T.; Suleiman, Y.; Karstens, R.; Weertman, B.
2012-04-01
At the IRIS Data Management Center (DMC) we primarily manage seismological data, but we also hold other geophysical data sets for related fields, including atmospheric pressure and gravity measurements, as well as higher-level data products derived from raw data. With a few exceptions, all data managed by the IRIS DMC are openly available, and we serve an international research audience. These data are available via a number of different mechanisms: batch requests submitted through email, web interfaces, near-real-time streams and, more recently, web services. Our initial suite of web services offers access to almost all of the raw data and associated metadata managed at the DMC. In addition, we offer services that apply processing to the data before they are sent to the user. Web service technologies are ubiquitous, with support available in nearly every programming language and operating system. By their nature web services are programmatic interfaces, but by choosing a simple subset of web service methods we make our data available to a very broad user base; the interfaces are usable by professional developers as well as non-programmers. Whenever possible we chose open and recognized standards. The data returned to the user come in a variety of formats depending on type, including FDSN SEED, QuakeML, StationXML, ASCII, PNG images and, in some cases where no appropriate standard could be found, a customized XML format. To promote easy access to seismological data for all researchers, we are coordinating with international partners to define web service interface standards, and we are working with key partners in Europe to complete the initial implementation of these services. Once a standard has been adopted and implemented at multiple data centers, researchers will be able to use the same request tools to access data across those centers. The web services that apply on-demand processing to requested data include the capability to apply instrument corrections and format translations, which ultimately allows more researchers to use the data without knowledge of specific data and metadata formats. In addition to serving as a new platform on top of which research scientists can build advanced processing tools, we anticipate that these services will make more data accessible to more users.
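As an example of how simple such access can be, the snippet below queries the IRIS DMC's publicly documented FDSN station web service for station metadata in plain-text form; the network and station codes are arbitrary examples.

```typescript
// Query the IRIS DMC FDSN station service for station metadata as plain text.
// The network/station codes are arbitrary examples; the endpoint is the
// publicly documented fdsnws-station service.

const params = new URLSearchParams({
  net: "IU",        // network code (example)
  sta: "ANMO",      // station code (example)
  level: "station", // metadata granularity
  format: "text",   // human-readable pipe-delimited output
});

async function fetchStations(): Promise<void> {
  const url = `https://service.iris.edu/fdsnws/station/1/query?${params}`;
  const response = await fetch(url);
  if (!response.ok) throw new Error(`HTTP ${response.status}`);
  console.log(await response.text());
}

fetchStations();
```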
Optimizing Crawler4j using MapReduce Programming Model
NASA Astrophysics Data System (ADS)
Siddesh, G. M.; Suresh, Kavya; Madhuri, K. Y.; Nijagal, Madhushree; Rakshitha, B. R.; Srinivasa, K. G.
2017-06-01
The World Wide Web is a decentralized system consisting of a repository of information in the form of web pages. These web pages act as a source of information or data in the present analytics world. Web crawlers extract useful information from web pages for different purposes. First, they are used in web search engines, where web pages are indexed to form a corpus of information that users can query. Second, they are used for web archiving, where web pages are stored for later analysis. Third, they can be used for web mining, where web pages are monitored for copyright purposes. The throughput of a web crawler can be improved by exploiting modern parallel-processing technologies. To address the parallelism and throughput of crawling, this work proposes to optimize Crawler4j, a web crawler that retrieves useful information about the pages it visits, using the Hadoop MapReduce programming model to parallelize the processing of large input data. Crawler4j, coupled with the data and computational parallelism of Hadoop MapReduce, improves the throughput and accuracy of web crawling. The experimental results demonstrate that the proposed solution achieves significant improvements in performance and throughput, carving out a new methodology for optimizing web crawling.
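To make the MapReduce framing concrete, here is a minimal, self-contained sketch of the programming model itself (not Crawler4j's or Hadoop's actual APIs): mappers emit (host, 1) pairs for crawled URLs, a shuffle groups pairs by key, and reducers sum the counts per host.

```typescript
// Minimal in-memory illustration of the MapReduce programming model applied
// to crawl output: count pages per host. This mirrors the map/shuffle/reduce
// phases Hadoop would run in parallel; it is not Crawler4j's actual code.

type Pair = [key: string, value: number];

// Map phase: one (host, 1) pair per crawled URL.
function mapper(url: string): Pair[] {
  return [[new URL(url).hostname, 1]];
}

// Shuffle phase: group values by key (done by the framework in Hadoop).
function shuffle(pairs: Pair[]): Map<string, number[]> {
  const groups = new Map<string, number[]>();
  for (const [key, value] of pairs) {
    const bucket = groups.get(key) ?? [];
    bucket.push(value);
    groups.set(key, bucket);
  }
  return groups;
}

// Reduce phase: sum the grouped values per host.
function reducer(values: number[]): number {
  return values.reduce((a, b) => a + b, 0);
}

const crawled = [
  "https://example.org/a",
  "https://example.org/b",
  "https://example.com/index",
];

const counts = new Map(
  [...shuffle(crawled.flatMap(mapper))].map(
    ([host, vals]) => [host, reducer(vals)] as [string, number]
  )
);
console.log(counts); // Map { "example.org" => 2, "example.com" => 1 }
```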
Web Services as Building Blocks for an Open Coastal Observing System
NASA Astrophysics Data System (ADS)
Breitbach, G.; Krasemann, H.
2012-04-01
Coastal observing systems need to integrate different observing methods such as remote sensing, in-situ measurements, and models into a synoptic view of the state of the observed region. This integration can be based solely on web services combining data and metadata. Such an approach is pursued for COSYNA (Coastal Observing System for Northern and Arctic seas). Data from satellite and radar remote sensing and measurements from buoys, stations, and FerryBoxes form the observation part of COSYNA. These data are assimilated into models to create pre-operational forecasts. For discovering data, the COSYNA data portal uses an OGC Web Feature Service (WFS). This Web Feature Service holds not only the metadata needed to find data but also the URLs of the web services used to view and download them. To make data from different sources comparable, a common vocabulary is needed; for COSYNA, the standard names from the CF conventions are stored in the metadata whenever possible. The metadata are kept in an INSPIRE- and ISO 19115-compatible format, and the WFS is fed from the metadata system using database views. The data themselves are stored in two different formats: NetCDF files for gridded data and an RDBMS for time-series-like data. The web services are mostly standards-based, and the standards are mainly OGC standards. Maps are created from NetCDF files with the ncWMS tool, whereas a self-developed Java servlet renders maps of moving measurement platforms; in the latter case, data download is offered via OGC SOS. For NetCDF files, OPeNDAP is used for data download, and OGC CSW is used for accessing extended metadata. We will present the COSYNA data management concept, which is independent of the particular services used in COSYNA; it is parameter- and data-centric and may be useful for other observing systems.
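As an illustration of the kind of discovery call involved, an OGC WFS GetFeature request is just a parameterized HTTP GET. The base URL and feature type name below are invented placeholders, not actual COSYNA identifiers; the query parameters follow the WFS 1.1.0 standard.

```typescript
// Sketch of an OGC WFS GetFeature request, the standard discovery call
// described above. The base URL and feature type name are placeholders;
// the query parameters follow the WFS 1.1.0 standard.

const base = "https://example.org/geoserver/wfs"; // hypothetical endpoint

const query = new URLSearchParams({
  service: "WFS",
  version: "1.1.0",
  request: "GetFeature",
  typeName: "cosyna:observations",         // placeholder feature type
  bbox: "6.0,53.0,9.0,56.0,EPSG:4326",     // German Bight, roughly
  outputFormat: "application/json",        // server-dependent output format
});

async function getFeatures(): Promise<unknown> {
  const response = await fetch(`${base}?${query}`);
  if (!response.ok) throw new Error(`WFS error: HTTP ${response.status}`);
  return response.json(); // feature collection, incl. view/download URLs
}

getFeatures().then((fc) => console.log(fc));
```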
Web-based interactive data processing: application to stable isotope metrology.
Verkouteren, R M; Lee, J N
2001-08-01
To address a fundamental need in stable isotope metrology, the National Institute of Standards and Technology (NIST) has established a web-based interactive data-processing system accessible through a common gateway interface (CGI) program on the internet site http://www.nist.gov/widps-co2. This is the first application of a web-based tool that improves the measurement traceability afforded by a series of NIST standard materials. Specifically, this tool promotes the proper usage of isotope reference materials (RMs) and improves the quality of reported data from extensive measurement networks. Through the International Atomic Energy Agency (IAEA), we have defined standard procedures for stable isotope measurement and data-processing, and have determined and applied consistent reference values for selected NIST and IAEA isotope RMs. Measurement data of samples and RMs are entered into specified fields on the web-based form. These data are submitted through the CGI program on a NIST Web server, where appropriate calculations are performed and results returned to the client. Several international laboratories have independently verified the accuracy of the procedures and algorithm for measurements of naturally occurring carbon-13 and oxygen-18 abundances and slightly enriched compositions up to approximately 150% relative to natural abundances. To conserve the use of the NIST RMs, users may determine value assignments for a secondary standard to be used in routine analysis. Users may also wish to validate proprietary algorithms embedded in their laboratory instrumentation, or specify the values of fundamental variables that are usually fixed in reduction algorithms to see the effect on the calculations. The results returned from the web-based tool are limited in quality only by the measurements themselves, and further value may be realized through the normalization function. When combined with stringent measurement protocols, two- to threefold improvements have been realized in the reproducibility of carbon-13 and oxygen-18 determinations across laboratories.
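The core calculations behind such a tool are standard delta notation and two-point scale normalization. The sketch below shows both; the numeric reference values in the example are illustrative placeholders, not NIST's certified assignments.

```typescript
// Delta notation and two-point normalization as used in stable isotope
// metrology: delta = (R_sample / R_reference - 1) * 1000, in per mil.
// Reference values below are illustrative placeholders, not certified values.

function delta(rSample: number, rReference: number): number {
  return (rSample / rReference - 1) * 1000;
}

// Two-point normalization: map measured delta values onto the reference
// scale defined by two RMs with known (assigned) values.
function normalize(
  measured: number,
  rm1Measured: number, rm1Assigned: number,
  rm2Measured: number, rm2Assigned: number
): number {
  const slope = (rm2Assigned - rm1Assigned) / (rm2Measured - rm1Measured);
  return rm1Assigned + slope * (measured - rm1Measured);
}

// Example: a sample measured at -9.90 per mil, bracketed by two RMs.
const corrected = normalize(-9.9, /* rm1 */ -46.2, -46.6, /* rm2 */ 1.8, 1.95);
console.log(corrected.toFixed(2)); // normalized delta-13C, per mil (illustrative)
```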
Briand, Dominique; Roux, Emmanuel; Desconnets, Jean Christophe; Gervet, Carmen; Barcellos, Christovam
2018-03-21
From prehistory to the present day, and despite a fierce fight against it, malaria remains a concern for human beings. While advances in science and technology enabled the eradication of some infectious diseases in the 20th century, malaria persists. This review aims at assessing how Internet and web technologies are used in fighting malaria. Specifically, how do malaria-fighting actors benefit from these developments, how do they deal with ensuing phenomena such as growing data volumes, and have these technologies created new opportunities for fighting malaria? Eleven web platforms linked to spatio-temporal malaria information are reviewed, focusing on data, metadata, web services and categories of users. Although the platforms are highly heterogeneous, the review reveals that the latest advances in web technologies are underused: information is rarely updated dynamically, metadata catalogues are absent, web services are increasingly used but rarely standardized, and websites are mainly aimed at scientific communities, essentially researchers. Improving system interoperability through standardization is an opportunity to be seized in order to allow real-time information exchange and online multi-source data analysis. To facilitate multidisciplinary and multiscale studies, linked-data and semantic-web innovations can be used to formalize the different viewpoints of the actors involved in the fight against malaria. New malaria-fighting strategies could then emerge to tackle the bottlenecks listed in the United Nations Millennium Development Goals reports, as well as specific issues highlighted by the World Health Organization, such as malaria elimination across international borders.
NASA Astrophysics Data System (ADS)
Ahern, T. K.; Ekstrom, G.; Grobbelaer, M.; Trabant, C. M.; Van Fossen, M.; Stults, M.; Tsuboi, S.; Beaudoin, B. C.; Bondar, I.
2016-12-01
Seismology, by its very nature, requires sharing information across international boundaries, and as such it evolved as a science that promotes free and open access to data. The International Federation of Digital Seismograph Networks (FDSN) has commission status within IASPEI and as such is the international standards body in our community. In the late 1980s, a domain standard for exchanging seismological information was created, and the SEED format is still the dominant domain standard. More recently, the FDSN standardized web-service interfaces for key services used in our community. The standardization of these services also enabled the development of a federation of data centers; these federated centers can be accessed through standard FDSN service calls. Client software currently allows seamless and transparent access to all data managed at 14 globally distributed data centers on three continents, with plans to expand this more broadly. IRIS is also involved in the EarthCube project funded by the US National Science Foundation. The GEOphysical Web Services (GeoWS) project extended the style of web services endorsed by the FDSN to interdisciplinary domains. IRIS worked with five data centers in other domains (Caltech, UCSD, Columbia University, UNAVCO and Unidata), drawn from the oceanographic, atmospheric, and solid-earth divisions within the NSF's geosciences directorate, to develop similar service-based interfaces to their data systems. Additionally, IRIS developed GeoWS-style web services for six further data collections: magnetic observations, field gravity measurements, superconducting gravimetry data, volcano monitoring data, tidal data, and oceanographic observations, including those from cabled arrays in the ocean. This presentation will highlight the success the FDSN and GeoWS services have demonstrated within and beyond seismology, as well as identify some next steps being considered.
Fortier, Michelle A.; Bunzli, Elizabeth; Walthall, Jessica; Olshansky, Ellen; Saadat, Haleh; Santistevan, Ricci; Mayes, Linda; Kain, Zeev N.
2015-01-01
Background: The purpose of this two-phase project was to conduct a formative evaluation and test the preliminary efficacy of a newly developed web-based, tailored behavioral preparation program (WebTIPS) for children undergoing outpatient surgery and their parents. Methods: Phase I enrolled 13 children aged 2–7 years undergoing outpatient elective surgery and their parents for formative evaluation of WebTIPS. Parents participated in focus groups, which are common in qualitative research and are a method of asking research participants about their perceptions and attitudes regarding a product or concept. In Phase II, children aged 2–7 years at two medical centers were randomly assigned to receive the WebTIPS program (n = 38) or standard of care (n = 44). The primary outcome of Phase II was child and parent preoperative anxiety. Results: In Phase I, parents reported WebTIPS to be both helpful (p < 0.001) and easy to use (p < 0.001). In Phase II, children in the WebTIPS group (36.2 ± 14.1) were less anxious than children in the standard-of-care group (46.0 ± 19.0) at entrance to the operating room (p = 0.02; Cohen's d = 0.59) and at introduction of the anesthesia mask (43.5 ± 21.7 vs. 57.0 ± 21.2, respectively; p = 0.01; Cohen's d = 0.63). Parents in the WebTIPS group (32.1 ± 7.4) also experienced less anxiety than parents in the control group (36.8 ± 7.1) in the preoperative holding area (p = 0.004; Cohen's d = 0.65). Conclusions: WebTIPS was well received by parents and children and led to reductions in preoperative anxiety. PMID:25790213