Microbiological contamination with fungi, including moulds, can pose a significant health hazard to those working in archives or museums. The species involved include Aspergillus, Penicillium, Geotrichum, Alternaria, Cladosporium, Mucor, Rhizopus, Trichoderma, and Fusarium, which are associated mostly with allergic responses of different types. The aim of the study was to analyse, in both quantitative and qualitative terms, workplace air samples collected in a library and in archive storage facilities. Occupational exposure and the related health hazard from microbiological contamination with moulds were assessed in three archive storage buildings and one library. Air samples (60 in total) were collected by the impact method before work and at hourly intervals during work. Surface samples from the artifacts were collected by pressing a counting (RODAC) plate filled with malt extract agar against the surface of the artifacts. The air and surface sample analyses yielded 36 different mould species, classified into 19 genera, of which Cladosporium and Penicillium were the most prevalent. Twelve species were regarded as potentially pathogenic for humans: 8 had allergic and 11 toxic properties, the latter including Aspergillus fumigatus. Quantitative analysis revealed air microbiological contamination with moulds at levels ranging from 1.8 × 10² to 2.3 × 10³ cfu/m³. In surface samples from library and archive artifacts, 11 fungal species were distinguished; the number of species per artifact varied from 1 to 6, and colony counts ranged from 4 × 10¹ to 8 × 10¹ cfu/100 cm². Higher contamination levels were found only for Cladosporium cladosporioides (1.48 × 10³ cfu/100 cm²) and Paecilomyces variotii (1.2 × 10² cfu/100 cm²). Although no clearly visible signs of mould contamination could be found at the workstations examined, the study revealed abundant micromycetes, with Cladosporium and Penicillium the predominant genera.
The detected species also included potentially pathogenic microorganisms that can cause allergic and toxic effects, such as Aspergillus fumigatus, and could therefore be hazardous to workers' health. For some species, the concentration levels exceeded the proposed hygienic standards for total microscopic fungi in occupational settings. The findings of the study point to unsatisfactory hygienic conditions at the worksites examined, resulting in microbiological contamination with moulds, and to the need for prompt remedial action by the employers.
Research Article | Published online: July 21 2009 | Subject Area: Immunology and Allergy
H. Gewurz, R.J. Pickering, S.E. Mergenhagen, R.A. Good: The Complement Profile in Acute Glomerulonephritis, Systemic Lupus Erythematosus and Hypocomplementemic Chronic Glomerulonephritis: Contrasts and Experimental Correlations. International Archives of Allergy and Applied Immunology, 1 June 1968; 34 (6): 556–570. https://doi.org/10.1159/000230149
Author affiliations: The Pediatric Research Laboratories of the Variety Club Heart Hospital and the Department of Microbiology of the University of Minnesota, Minneapolis, Minn., and the Immunology Section, Laboratory of Microbiology, National Institute of Dental Research, National Institutes of Health, Bethesda, Md.
The digitization of displaced archives is of great historical and cultural significance. Through the construction of digital humanities platforms, represented by the MISS platform, and the comprehensive application of IIIF technology, knowledge graph technology, ontology technology, and other popular information technologies, we can find that the digital framework for displaced archives built through the MISS platform can promote the establishment of a standardized cooperation and dialogue mechanism between the archives authorities and other government departments. At the same time, it can embed the work of archives into the construction of digital government and the economy, promote the exploration of the integration of archives management, data management, and information resource management, and ultimately promote the construction of a digital society. By fostering a new partnership between archives departments and enterprises, think tanks, research institutes, and industry associations, the role of multiple social subjects in the modernization of the archives governance system and governance capacity will be brought into play. The National Archives Administration has launched a special oper
The creation of open archives, i.e. archives where access is regulated by open licensing models (content, source, data), should be seen as part of a broader socio-economic phenomenon that finds legal expression in specific organizational and technical formats. This paper examines the origins and main characteristics of the open archives phenomenon. We investigate the extent to which different models of production of economic or social value can be expressed in different forms of licensing in the context of open archives. Through this process, we assess the extent to which the digital archive is moving towards providing access that is deeper (meaning that it offers more access rights) and wider (in the sense that most of the information given is under open content licensing), or is facing a gradual stratification and polarization of content. Such stratification entails the emergence of two types of content: content to which access is extremely limited and content to which access remains completely open. This differentiation between classes of content is the result of multiple factors: from purely legislative, administrative and contractual restrictions (e.g. data protection and confidentiali
Microorganisms are ubiquitous in nature, and microbial activities are closely intertwined with the entire life cycle system and human life. Developing novel technologies for the detection, characterization and manipulation of microorganisms promotes their applications in clinical, environmental and industrial areas. Over the last two decades, terahertz (THz) technology has emerged as a new optical tool for microbiology. Its great potential originates from the unique advantages of THz waves, including their high sensitivity to water and inter-/intra-molecular motions, a non-invasive and label-free detection scheme, and their low photon energy. THz waves have been utilized as a stimulus to alter microbial functions, or as a sensing approach for quantitative measurement and qualitative differentiation. This review focuses specifically on recent research progress of THz technology applied in the field of microbiology, covering two major parts: THz biological effects and microbial detection applications. At the end of this paper, we summarize the research progress and discuss the challenges currently faced by THz technology in microbiology, along with potential solutions. We also
The study of microorganisms, or microbiology, has developed significantly since its inception and is currently a key field of the biological sciences with a huge impact on modern society and scientific research. Over the centuries, this discipline has undergone significant changes, shaping our understanding of infectious diseases and food safety, progressing from the simplest observations of microscopic organisms such as bacteria, viruses, fungi and protozoa to modern molecular and genomic research methods. This article describes a brief historical path of the development of microbiology. The heuristic, morphological, physiological, immunological, and molecular genetic stages are the main periods into which the development of this science is traditionally divided, despite the lack of full-fledged and precise boundaries between them.
The article examines the theoretical, methodological, and technical foundations of research on audiovisual corpora within the field of digital humanities. It outlines the main transversal issues underlying the processes of constructing, exploiting, and interpreting such corpora, which are conceived as specific forms of textual data in the broad sense - that is, as sets of semiotic traces (written, visual, sound, or multimodal) that make it possible to document, analyze, and transmit domains of knowledge. The analysis is organized around five complementary themes. The first concerns the status and structure of textual data lato sensu: any data, regardless of its medium, participates in a meaningful representation of a domain and therefore requires a unified theoretical and methodological framework based on a transdisciplinary semiotic approach. The second theme addresses the documentary value of data and corpora, understood as the relevance of materials for documenting a research object in relation to the goals and perspectives of the projects in which they are used. This value depends both on provenance and reasoned selection, and on the pragmatic context of their use. The third th
The IANEC project (Investigation of Digital Archives of Contemporary Writers), led by the GREYC Research Lab and funded by the French Ministry of Culture, aims to develop dedicated digital forensic investigation tools to automate the analysis of archival corpora from the Institut Mémoires de l'Édition Contemporaine (IMEC). The project is based on the observation that born-digital archival materials are increasingly prevalent in contemporary archival institutions, and that digital forensics technologies have become essential for the extraction, identification, processing, and description of natively digital archival corpora.
The digital transformation is turning archives, both old and new, into data. As a consequence, automation in the form of artificial intelligence techniques is increasingly applied both to scale traditional recordkeeping activities, and to experiment with novel ways to capture, organise and access records. We survey recent developments at the intersection of Artificial Intelligence and archival thinking and practice. Our overview of this growing body of literature is organised through the lenses of the Records Continuum model. We find four broad themes in the literature on archives and artificial intelligence: theoretical and professional considerations, the automation of recordkeeping processes, organising and accessing archives, and novel forms of digital archives. We conclude by underlining emerging trends and directions for future work, which include the application of recordkeeping principles to the very data and processes which power modern artificial intelligence, and a more structural, yet critically-aware, integration of artificial intelligence into archival systems and practice.
Digitization of historical records has produced a significant amount of data for analysis and interpretation. A critical challenge is the ability to relate historical information across different archives so that the data can be framed in the appropriate historical context. This paper presents a real-world case study on historical information integration and record matching, with the goal of improving the historical value of archives containing data from the period 1800 to 1920. The archives contain unique information about Métis and Indigenous people in Canada and their interactions with European settlers, comprising thousands of records whose relevance increases when relationships and interconnections are discovered. The contribution is a record linking approach suitable for historical archives and an evaluation of its effectiveness. Experimental results demonstrate the potential for discovering historical linkages with high precision, enabling new historical discoveries.
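As an illustration of the kind of record matching involved (the paper's own approach is not detailed here), the following is a minimal fuzzy name-matching sketch using only the Python standard library; the records, field names, and threshold are hypothetical:

```python
from difflib import SequenceMatcher

def normalize(name: str) -> str:
    """Lower-case and strip punctuation so spelling variants compare fairly."""
    return "".join(c for c in name.lower() if c.isalnum() or c.isspace()).strip()

def name_similarity(a: str, b: str) -> float:
    """Similarity ratio in [0, 1]; 1.0 means identical after normalization."""
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio()

def link_records(archive_a, archive_b, threshold=0.85):
    """Pair records across two archives when name similarity exceeds threshold."""
    links = []
    for ra in archive_a:
        for rb in archive_b:
            score = name_similarity(ra["name"], rb["name"])
            if score >= threshold:
                links.append((ra["id"], rb["id"], round(score, 2)))
    return links

# Hypothetical records with spelling variants typical of 19th-century registers
a = [{"id": "A1", "name": "Jean-Baptiste Lagimodière"}]
b = [{"id": "B7", "name": "Jean Baptiste Lagimodiere"}]
print(link_records(a, b))
```

In practice a historical record linker would combine several fields (name, date, location) and a blocking step to avoid the quadratic comparison shown here.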
This study addresses, from the Optimal Experimental Design perspective, the use of the isothermal experimentation procedure to precisely estimate the parameters of models used in predictive microbiology. Starting from a case study set out in the literature, and taking the Baranyi model as the primary model and the Ratkowsky square-root model as the secondary model, D- and c-optimal designs are provided for isothermal experiments, taking the temperature both as a value fixed by the experimenter and as a variable to be designed. The designs calculated show that those commonly used in practice are not efficient enough to estimate the parameters of the secondary model, leading to greater uncertainty in the predictions made via these models. Finally, an analysis is carried out to determine the effect on efficiency of a possible reduction in the final experimental time.
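The Ratkowsky square-root secondary model mentioned above relates the maximum specific growth rate to temperature via sqrt(mu_max) = b(T - Tmin). Since sqrt(mu_max) is linear in T, its two parameters can be recovered by ordinary least squares; the sketch below uses synthetic data generated from assumed values b = 0.04 and Tmin = 4 °C (not values from the study):

```python
import math

# Ratkowsky square-root secondary model: sqrt(mu_max) = b * (T - Tmin)
# Hypothetical isothermal growth rates (1/h), generated from b = 0.04,
# Tmin = 4 °C purely for illustration.
temps = [10.0, 15.0, 20.0, 25.0, 30.0]
mu_max = [(0.04 * (t - 4.0)) ** 2 for t in temps]

# sqrt(mu_max) is linear in T, so ordinary least squares recovers b and Tmin.
y = [math.sqrt(m) for m in mu_max]
n = len(temps)
t_bar = sum(temps) / n
y_bar = sum(y) / n
b = sum((t - t_bar) * (v - y_bar) for t, v in zip(temps, y)) / sum(
    (t - t_bar) ** 2 for t in temps
)
t_min = t_bar - y_bar / b  # intercept = -b*Tmin  =>  Tmin = T_bar - y_bar/b

print(f"b = {b:.3f}, Tmin = {t_min:.1f} degC")  # recovers b = 0.040, Tmin = 4.0
```

With noisy real data the fit would not be exact, which is precisely where the choice of measurement temperatures (the design) governs how uncertain b and Tmin become.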
Screenshots of social media posts are a common approach for information sharing. Unfortunately, before sharing a screenshot, users rarely verify whether the attribution of the post is fake or real. There are numerous legitimate reasons to share screenshots. However, sharing screenshots of social media posts is also a vector for mis-/disinformation spread on social media. We are exploring methods to verify the attribution of a social media post shown in a screenshot, using resources found on the live web and in web archives. We focus on the use of web archives, since the attribution of non-deleted posts can be relatively easily verified using the live web. We show how information from a Twitter screenshot (Twitter handle, timestamp, and tweet text) can be extracted and used for locating potential archived tweets in the Internet Archive's Wayback Machine. We evaluate our method on a dataset of 1,571 single tweet screenshots.
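The lookup step can be sketched against the Wayback Machine's public CDX API; the handle, tweet id, and search window below are hypothetical, and the paper's exact procedure may differ:

```python
from datetime import datetime, timedelta
from urllib.parse import urlencode

def wayback_cdx_query(handle: str, tweet_id: str, timestamp: datetime,
                      window_hours: int = 24) -> str:
    """Build a Wayback Machine CDX API query URL for mementos of a tweet
    within a window around the timestamp shown in the screenshot."""
    tweet_url = f"twitter.com/{handle}/status/{tweet_id}"
    fmt = "%Y%m%d%H%M%S"  # CDX timestamps are 14-digit strings
    params = urlencode({
        "url": tweet_url,
        "from": (timestamp - timedelta(hours=window_hours)).strftime(fmt),
        "to": (timestamp + timedelta(hours=window_hours)).strftime(fmt),
        "output": "json",
    })
    return f"https://web.archive.org/cdx/search/cdx?{params}"

# Hypothetical handle, tweet id, and screenshot timestamp
q = wayback_cdx_query("example_user", "123456789", datetime(2020, 3, 15, 12, 0))
print(q)
```

Fetching that URL would return candidate archived captures whose text could then be compared against the text extracted from the screenshot.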
Colonial archives are at the center of increased interest from a variety of perspectives, as they contain traces of historically marginalized people. Unfortunately, like most archives, they remain difficult to access due to significant persisting barriers. We focus here on one of them: the biases to be found in historical finding aids, such as indexes of person names, which remain in use to this day. In colonial archives, indexes can perpetuate silences by omitting mentions of historically marginalized persons. In order to overcome such limitations and pluralize the scope of existing finding aids, we propose using automated entity recognition. To this end, we contribute a fit-for-purpose annotation typology and apply it to the colonial archive of the Dutch East India Company (VOC). We release a corpus of nearly 70,000 annotations as a shared task, for which we provide baselines using state-of-the-art neural network models. Our work intends to stimulate further contributions in the direction of broadening access to (colonial) archives, integrating automation as a possible means to this end.
Although the Internet Archive's Wayback Machine is the largest and most well-known web archive, a number of public web archives have emerged in the last several years. With varying resources, audiences and collection development policies, these archives overlap with each other to varying degrees. While individual archives can be measured in terms of number of URIs, number of copies per URI, and intersection with other archives, to date there has been no answer to the question "How much of the Web is archived?" We study the question by approximating the Web using sample URIs from DMOZ, Delicious, Bitly, and search engine indexes, and counting how many copies of the sample URIs exist in various public web archives. Each sample set carries its own bias. The results from our sample sets indicate that 35%-90% of the Web has at least one archived copy, 17%-49% has between 2-5 copies, 1%-8% has 6-10 copies, and 8%-63% has more than 10 copies in public web archives. The number of URI copies varies as a function of time, but no more than 31.3% of URIs are archived more than once per month.
The laboratory case definition of leptospirosis is rarely supported by a solid evaluation that determines cut-off values for the tests applied. This study describes the process of determining optimal cut-off titers of laboratory tests for a valid case definition of leptospirosis. The tests in question are the microscopic agglutination test (MAT) and an in-house IgM enzyme-linked immunosorbent assay (ELISA), both on single serum and paired samples, using a positive culture as the reference test in the Dutch population. Specificity was assessed using panels of sera from healthy donors, cases with known other diseases, and non-leptospirosis cases with symptoms compatible with leptospirosis. Cases were divided into three periods corresponding to the acute phase (1-10 days post onset of illness (DPO)), the early convalescent phase (11-20 DPO) and the late convalescent phase (>20 DPO). Cut-off titers for MAT and IgM ELISA were determined as 1:160 and 1:80, respectively, for all three periods. These cut-off titers combined 100% specificity with a sensitivity that varied with the stage of disease for both tests. The low sensitivities in the early acute phase are consistent with the dynamics of the humoral immune response. IgM ELISA yielded higher sensitivities than MAT in the acute and early convalescent stages. Moreover, the optimal sensitivity of MAT, the gold standard, was <82%, implying that a significant part of global cases is missed by this recommended test. MAT and IgM ELISA proved to be partly complementary, resulting in a higher sensitivity when the results of the two tests were combined. The availability of paired samples and of adequate clinical and epidemiological data are other parameters that will significantly increase the sensitivity of laboratory confirmation.
This study enables fine-tuning of the current laboratory case definition towards improved case finding, and implies that solid validation of laboratory parameters for case definition will improve both diagnosis for individual patient care and estimation of the disease burden on a worldwide scale.
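The trade-off described above, where a higher cut-off titer buys specificity at the cost of sensitivity, can be illustrated with synthetic data; the titers below are invented for illustration and are not from the study:

```python
def sensitivity_specificity(titers_cases, titers_controls, cutoff):
    """Sensitivity among culture-confirmed cases and specificity among
    controls, for a 'positive if titer >= cutoff' decision rule."""
    tp = sum(t >= cutoff for t in titers_cases)
    tn = sum(t < cutoff for t in titers_controls)
    return tp / len(titers_cases), tn / len(titers_controls)

# Synthetic reciprocal MAT titers (160 means 1:160), purely illustrative
cases = [40, 80, 160, 320, 640, 1280, 160, 320]      # culture-positive sera
controls = [0, 0, 20, 40, 80, 0, 20, 0, 40, 80]      # healthy / other disease

for cutoff in (80, 160, 320):
    sens, spec = sensitivity_specificity(cases, controls, cutoff)
    print(f"1:{cutoff}: sensitivity={sens:.2f}, specificity={spec:.2f}")
```

With this toy data, raising the cut-off from 1:80 to 1:160 brings specificity to 100% while sensitivity drops, mirroring the pattern the study reports at its chosen cut-offs.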
A number of serious reasons will convince an increasing number of researchers to store their relevant material in centers which we will call "language resource archives". These combine the duty of long-term preservation with the task of giving different user groups access to the material. Access here is meant in the sense that active interaction with the data is made possible, to support the integration of new data, new versions, or commentaries of all sorts. Modern language resource archives will have to adhere to a number of basic principles to fulfill all requirements, and they will have to be involved in federations to create joint language resource domains, making it even simpler for researchers to access the data. This paper makes an attempt to formulate the essential pillars language resource archives have to adhere to.
Mathematical models are increasingly a part of microbiological research. Here, we share our perspective on how modeling advances the discipline by: (i) enforcing logical consistency, (ii) enabling quantitative prediction, (iii) extracting hidden parameters from data, and (iv) generating intuitive understanding. We map a spectrum of modeling frameworks, from whole-cell simulations to minimal logistic growth equations, and provide interactive examples for some common frameworks. Building on this overview, we outline pragmatic criteria for choosing an appropriate level of description to capture phenomena of interest. Finally, we present a case study in modeling of microbial ecosystems from our own work to illustrate how mechanistic modeling can yield generalizable intuition. This perspective aims to be an introductory roadmap for integrating mathematical modeling into experimental microbiology.
Phosphorus (P) is considered to be one of the key elements for life, making it an important element to look for in the abundance analysis of spectra of stellar systems. Yet, only a handful of spectroscopic studies exist that estimate P abundances and investigate their trend across a range of metallicities. We have observed full HK-band spectra at a spectral resolving power of R=45,000 with the IGRINS instrument. Abundances are determined using SME in combination with 1D MARCS stellar atmosphere models. The investigated sample of stars has reliable stellar parameters estimated using optical FIES spectra (GILD; Jönsson et al. in prep.). In order to determine the P abundances from the 16482.92 Angstrom P line, we take special care with the CO($ν=7-4$) blend. We determine the C, N, O abundances from atomic carbon and a range of non-blended molecular lines (CO, CN, OH), which are plentiful in the H-band region of K giant stars, assuring an appropriate modelling of the blending CO($ν=7-4$) line. We present the [P/Fe] vs [Fe/H] trend for 38 K giant stars in the metallicity range -1.2 dex $<$ [Fe/H] $<$ 0.4 dex. We find that our trend matches well with the compiled literature sample of
We document the creation of a data set of 16,627 archived web pages, or mementos, of 3,698 unique live web URIs (Uniform Resource Identifiers) from 17 public web archives. We used four different methods to collect the dataset. First, we used the Los Alamos National Laboratory (LANL) Memento Aggregator to collect mementos of an initial set of URIs obtained from four sources: (a) the Moz Top 500, (b) the dataset used in our previous study, (c) the HTTP Archive, and (d) the Web Archives for Historical Research group. Second, we extracted URIs from the HTML of already collected mementos. These URIs were then used to look up mementos in LANL's aggregator. Third, we downloaded web archives' published lists of URIs of both original pages and their associated mementos. Fourth, we collected more mementos from archives that support the Memento protocol by requesting TimeMaps directly from archives, not through the Memento aggregator. Finally, we downsampled the collected mementos to 16,627 due to our constraints of a maximum of 1,600 mementos per archive and being able to download all mementos from each archive in less than 40 hours.
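The fourth method, requesting TimeMaps directly from archives, returns link-format documents (RFC 7089) that can be parsed to enumerate mementos; below is a minimal sketch on a hypothetical, hard-coded TimeMap (a real run would fetch it over HTTP):

```python
import re

# Hypothetical link-format TimeMap (RFC 7089), of the kind returned by e.g.
# https://web.archive.org/web/timemap/link/http://example.com/
SAMPLE_TIMEMAP = """\
<http://example.com/>; rel="original",
<http://web.archive.org/web/timemap/link/http://example.com/>; rel="self",
<http://web.archive.org/web/http://example.com/>; rel="timegate",
<http://web.archive.org/web/20150101000000/http://example.com/>; rel="first memento"; datetime="Thu, 01 Jan 2015 00:00:00 GMT",
<http://web.archive.org/web/20180601120000/http://example.com/>; rel="memento"; datetime="Fri, 01 Jun 2018 12:00:00 GMT",
<http://web.archive.org/web/20200315090000/http://example.com/>; rel="last memento"; datetime="Sun, 15 Mar 2020 09:00:00 GMT",
"""

def count_mementos(timemap: str) -> int:
    """Count links whose rel attribute contains the 'memento' relation type,
    skipping original/self/timegate entries."""
    return len(re.findall(r'rel="[^"]*\bmemento\b[^"]*"', timemap))

print(count_mementos(SAMPLE_TIMEMAP))  # → 3
```

Counting mementos per original URI in this way, across each archive's TimeMaps, is the kind of tally that feeds the per-archive caps described above.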
In recent years, journalists and other researchers have used web archives as an important resource for their study of disinformation. This paper provides several examples of this use and also brings together some of the work that the Old Dominion University Web Science and Digital Libraries (WS-DL) research group has done in this area. We will show how web archives have been used to investigate changes to webpages, study archived social media including deleted content, and study known disinformation that has been archived.