The ILC Technology Network (ITN) was established in 2022 by the ILC International Development Team, a subcommittee of the International Committee for Future Accelerators, to advance engineering studies toward the realisation of the International Linear Collider (ILC). While the ITN work packages focus on engineering activities for the ILC, their topics are also relevant to a broad range of accelerator applications in particle physics and beyond. The work packages are now being carried out in close collaboration by laboratories in Asia and Europe. This report summarises the current status of the ITN activities.
As belief in the potential of computational social science grows, fuelled by recent advances in machine learning, data scientists are ostensibly becoming the new experts in education. Scholars engaged in critical studies of education and technology have sought to interrogate the growing datafication of education, yet tend not to use computational methods as part of this response. In this paper, we discuss the feasibility and desirability of the use of computational approaches as part of a critical research agenda. Presenting and reflecting upon two examples of projects that use computational methods in education to explore questions of equity and justice, we suggest that such approaches might help expand the capacity of critical researchers to highlight existing inequalities, make visible possible approaches for beginning to address such inequalities, and engage marginalised communities in designing and ultimately deploying these possibilities. Drawing upon work within the fields of Critical Data Studies and Science and Technology Studies, we further reflect on the two cases to discuss the possibilities and challenges of reimagining computational methods for critical research in education.
Answers to questions on the current status and future development of Quantum Science and Technology are presented.
This position paper encourages the Human-Computer Interaction (HCI) community to focus on designing deliberative processes to inform and coordinate technology and policy design for large language models (LLMs) -- a 'societal-scale technology'. First, I propose a definition for societal-scale technology and locate LLMs within this definition. Next, I argue that existing processes to ensure the safety of LLMs are insufficient and do not give the systems democratic legitimacy. Instead, we require processes of deliberation amongst users and other stakeholders on questions about the safety of outputs and deployment contexts. This shift in AI safety research and practice will require the design of corporate and public policies that determine how to enact deliberation and the design of interfaces and technical features to translate the outcomes of deliberation into technical development processes. To conclude, I propose roles for the HCI community to ensure deliberative processes inform technology and policy design for LLMs and other societal-scale technology.
Quantitative technology forecasting uses quantitative methods to understand and project technological changes. It is a broad field encompassing many different techniques and has been applied to a vast range of technologies. A widely used approach in this field is trend extrapolation. Based on the publications available to us, little or no attempt has been made to systematically review the empirical evidence on quantitative trend extrapolation techniques. This study attempts to close this gap by conducting a systematic review of the technology forecasting literature addressing the application of quantitative trend extrapolation techniques. We identified 25 studies relevant to the objective of this research and classified the techniques used in the studies into different categories, among which growth curves and time series methods were shown to remain popular over the past decade, while newer methods, such as machine learning-based hybrid models, have emerged in recent years. As more effort and evidence are needed to determine if hybrid models are superior to traditional methods, we expect to see a growing trend in the development and application of hybrid models to technology forecasting.
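As a concrete illustration of the growth-curve techniques the review covers, the following is a minimal sketch of logistic trend extrapolation; the adoption figures and initial parameter guesses are invented for illustration and are not taken from any of the 25 reviewed studies.

```python
# Minimal sketch of logistic growth-curve extrapolation, one of the
# trend extrapolation techniques surveyed above. The adoption data
# below are invented purely for illustration.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Logistic growth curve: K is the saturation level, r the growth
    rate, and t0 the inflection (midpoint) year."""
    return K / (1.0 + np.exp(-r * (t - t0)))

# Hypothetical cumulative adoption observed over ten years.
years = np.arange(2010, 2020)
adoption = np.array([2, 4, 8, 15, 27, 42, 58, 70, 79, 85], dtype=float)

# Fit the three parameters, then extrapolate five years ahead.
params, _ = curve_fit(logistic, years, adoption, p0=[100.0, 0.5, 2015.0])
future = np.arange(2020, 2025)
forecast = logistic(future, *params)

for y, f in zip(future, forecast):
    print(f"{y}: projected adoption {f:.1f}")
```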
This short paper provides a means to classify augmentation technologies to reconceptualize them as sociotechnical, discursive and rhetorical phenomena, rather than only through technological classifications. It identifies a set of value systems that constitute augmentation technologies within discourses, namely, the intent to enhance, automate, and build efficiency. This short paper makes a contribution to digital literacy surrounding augmentation technology emergence, as well as the more specific area of AI literacy, which can help identify unintended consequences implied at the design stages of these technologies.
State-of-the-art hardware compilers for FPGAs often fail to find efficient mappings of high-level designs to low-level primitives, especially complex programmable primitives like digital signal processors (DSPs). New approaches apply sketch-guided program synthesis to more optimally map designs. However, this approach has two primary drawbacks. First, sketch-guided program synthesis requires the user to provide sketches, which are challenging to write and require domain expertise. Second, the open-source SMT solvers which power sketch-guided program synthesis struggle with the sorts of operations common in hardware -- namely multiplication. In this paper, we address both of these challenges using an equality saturation (eqsat) framework. By combining eqsat and an existing state-of-the-art program-synthesis-based tool, we produce Churchroad, a technology mapper which handles larger and more complex designs than the program-synthesis-based tool alone, while eliminating the need for a user to provide sketches.
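For readers unfamiliar with the technique, the sketch below illustrates the core idea behind equality saturation at toy scale: grow a set of equivalent expressions under rewrite rules until nothing new appears, then extract the cheapest. Real eqsat engines use e-graphs for compact sharing rather than a flat set, and the rewrite rules and primitive costs here are illustrative assumptions, not Churchroad's.

```python
# Toy illustration of the idea behind equality saturation (eqsat).
# Expressions are nested tuples, e.g. ("mul", "x", 8).

def rewrites(expr):
    """Yield expressions equal to expr under simple hardware rewrite rules."""
    if isinstance(expr, tuple):
        op, *args = expr
        # Rule: x * 2^k == x << k (strength reduction, common in mapping).
        if op == "mul" and isinstance(args[1], int) and args[1] > 0 \
                and args[1] & (args[1] - 1) == 0:
            yield ("shl", args[0], args[1].bit_length() - 1)
        # Rule: commutativity of multiplication.
        if op == "mul":
            yield ("mul", args[1], args[0])

def saturate(expr, rounds=10):
    eclass = {expr}
    for _ in range(rounds):
        new = {r for e in eclass for r in rewrites(e)}
        if new <= eclass:
            break  # saturated: no rule produces anything new
        eclass |= new
    return eclass

COST = {"mul": 10, "shl": 1}  # assumed relative costs of primitives

def cost(expr):
    if not isinstance(expr, tuple):
        return 0
    op, *args = expr
    return COST.get(op, 1) + sum(cost(a) for a in args)

# Extract the cheapest equivalent of x * 8.
exprs = saturate(("mul", "x", 8))
print(min(exprs, key=cost))   # -> ('shl', 'x', 3)
```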
Digital technologies that help people take care of their dogs are becoming more widespread. Yet, little research explores what the role of technology in the human-dog relationship should be. We conducted a qualitative study incorporating quantitative and thematic analysis of 155 UK dog owners reflecting on their daily routines and technology's role in them, disentangling the what-where-why of interspecies routines and activities, technological desires, and rationales for technological support across common human-dog activities. We found that increasingly entangled daily routines lead to close multi-species households where dog owners conceptualize technology as having a role to support them in giving care to their dogs. When confronted with the role of technology across various activities, only chores like cleaning up after their dogs led to largely positive considerations, while activities that benefit the owners themselves, like walking together, led to largely negative considerations. For other activities, whether playing, training, or feeding, attitudes remain diverse. In general, across all activities, a nightmare scenario emerged of technology taking the human's role and, in doing so, disentangling the human-dog relationship.
Our society has been increasingly witnessing a number of negative, unintended consequences of digital technologies. While post-hoc policy regulation is crucial in addressing these issues, reasonably anticipating the consequences before deploying technology can help mitigate potential harm to society in the first place. Yet, the quest to anticipate potential harms can be difficult without seeing digital technologies deployed in the real world. In this position paper, we argue that anticipating unintended consequences of technology can be facilitated through creativity-enhancing interventions, such as by building on existing knowledge and insights from diverse stakeholders. Using lessons learned from prior work on creativity-support tools, the HCI community is uniquely equipped to design novel systems that aid in anticipating negative unintended consequences of technology on society.
Technological innovation is an important aspect of teaching and learning in the 21st century. This article examines faculty attitudes toward technology use in the classroom at one regional public university in the United States. Building on a faculty-led initiative to develop a Community of Practice for improving education, this study used a mixed-method approach of a faculty-developed, electronic survey to assess this topic. Findings from 72 faculty members revealed an overall positive stance toward technology in the classroom and the average faculty member utilized about six technology tools in their courses. The opportunities, barriers and future uses for technologies in the higher education classroom emerged from the open-ended questions on the survey. One finding of particular concern is that faculty are fearful that technology causes a loss of the humanistic perspective in education. The university is redesigning ten of its most popular courses to increase flexibility, accessibility and student success.
Persuasion is part and parcel of human interaction. Human persuaders have always existed in society: masters of rhetoric skilled at changing our minds, or at least our behaviors. Leaders, mothers, salesmen, and teachers are clear examples of persuaders. Persuaders often turn to technology and digital media to amplify their persuasive ends. Our lives, and how we lead them, are also influenced by technologies and digital media, but for the most part their effects on our attitudes and behaviors have been incidental, even accidental. Nowadays, the use of computers to sell products and services is considered the most frequent application of persuasive technology. In this short paper, based on an extensive review of the literature, we aim to give a brief introduction to persuasive technology and how it can play a role in enhancing and delivering best practice in IT. Some challenges of persuasive technology are discussed. Finally, we list some recommendations and steps that should be taken to empower professional IT practices.
We survey the current state of phase change memory (PCM), a non-volatile solid-state memory technology built around the large electrical contrast between the highly-resistive amorphous and highly-conductive crystalline states in so-called phase change materials. PCM technology has made rapid progress in a short time, having surpassed older technologies in terms of both sophisticated demonstrations of scaling to small device dimensions and integrated large-array demonstrators with impressive retention, endurance, performance and yield characteristics. We introduce the physics behind PCM technology, assess how its characteristics match up with various potential applications across the memory-storage hierarchy, and discuss its strengths, including scalability and rapid switching speed. We then address challenges for the technology, including the design of PCM cells for low RESET current, the need to control device-to-device variability, and undesirable changes in the phase change material that can be induced by the fabrication procedure. We then turn to issues related to operation of PCM devices, including retention, device-to-device thermal crosstalk, endurance, and bias-polarity effects.
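To make the retention discussion concrete, here is a minimal sketch of the Arrhenius extrapolation commonly used to project PCM data retention from accelerated high-temperature bake tests; the activation energy, bake time, and temperatures are assumed values for illustration, not figures from the survey.

```python
# Minimal sketch of Arrhenius retention extrapolation for PCM.
# All numbers below are illustrative assumptions.
import math

KB = 8.617e-5          # Boltzmann constant, eV/K
EA = 2.5               # assumed activation energy of crystallization, eV
T_BAKE = 210 + 273.15  # accelerated bake temperature, K
T_FAIL_BAKE = 3600.0   # assumed measured time-to-fail at T_BAKE, seconds

# Solve t = t0 * exp(EA / (kB * T)) for the prefactor using the bake
# point, then extrapolate to a typical operating temperature.
t0 = T_FAIL_BAKE / math.exp(EA / (KB * T_BAKE))

T_OP = 85 + 273.15     # operating temperature, K
t_op = t0 * math.exp(EA / (KB * T_OP))
print(f"projected retention at 85 C: {t_op / 3.15e7:.1f} years")
```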
5G New Radio (NR) technology operating in the millimeter wave (mmWave) band is expected to be utilized in areas with high and fluctuating traffic demands such as city squares, shopping malls, etc. The latter may result in quality of service (QoS) violations. To deal with this challenge, 3GPP has recently proposed NR unlicensed (NR-U) technology that may utilize the 60 GHz frequency band. In this paper, we investigate the deployment of NR-U base stations (BS) simultaneously operating in licensed and unlicensed mmWave bands in the presence of competing WiGig traffic, where NR-U users may use the unlicensed band as long as session rate requirements are met. To this aim, we utilize the tools of stochastic geometry, Markov chains, and queuing systems with random resource requirements to simultaneously capture the NR-U/WiGig coexistence mechanism and session service dynamics in the presence of mmWave-specific channel impairments. We then compare the performance of different offloading strategies, utilizing the eventual session loss probability as the main metric of interest. Our results show non-trivial behaviour of the collision probability in the unlicensed band as compared to lower-frequency systems.
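As a toy counterpart to the analytical machinery described above, the following Monte Carlo sketch estimates the session loss probability in a loss system with random resource requirements; all rates, capacities, and distributions are illustrative assumptions, not the paper's model.

```python
# Monte Carlo sketch of a loss system with random resource requirements:
# sessions arrive as a Poisson process, each demanding a random amount
# of an abstract band resource, and are lost if capacity is exceeded.
import heapq, random

random.seed(1)
CAPACITY = 100.0      # abstract resource units in the band
ARRIVAL_RATE = 20.0   # sessions per unit time (assumed)
MEAN_HOLD = 1.0       # mean session duration (assumed)
N_SESSIONS = 100_000

t, used, lost = 0.0, 0.0, 0
departures = []       # min-heap of (departure time, resources held)
for _ in range(N_SESSIONS):
    t += random.expovariate(ARRIVAL_RATE)
    # Release resources of sessions that ended before this arrival.
    while departures and departures[0][0] <= t:
        _, r = heapq.heappop(departures)
        used -= r
    demand = random.uniform(1.0, 10.0)  # random resource requirement
    if used + demand > CAPACITY:
        lost += 1                       # session blocked -> lost
    else:
        used += demand
        heapq.heappush(departures,
                       (t + random.expovariate(1 / MEAN_HOLD), demand))

print(f"estimated session loss probability: {lost / N_SESSIONS:.4f}")
```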
Authentication plays a significant part in security for public and private sectors such as healthcare, banking, transportation, and law enforcement. Biometric technology has grown quickly in recent years, especially in the areas of artificial intelligence and identity. Formerly, the authentication process depended on security measures like passcodes, identity fobs, and fingerprints. Despite these precautions, however, theft has increased in frequency. In response, biometric security was created, in which a person is identified by features derived from the physiological and behavioral traits of the human body using a biometric system. Biometric gadgets are available to the public, embedded in computer systems, electronic devices, mobile phones, and other consumer electronics. As fraud has increased, demand for and use of biometric electronic devices has increased as well, since they make it possible to confirm a person's distinct identity. The goal of this study is to examine developments in biometric systems in the disciplines of medicine and engineering. The study will present the progress made in these fields.
"Cognizing" (e.g., thinking, understanding, and knowing) is a mental state. Systems without mental states, such as cognitive technology, can sometimes contribute to human cognition, but that does not make them cognizers. Cognizers can offload some of their cognitive functions onto cognitive technology, thereby extending their performance capacity beyond the limits of their own brain power. Language itself is a form of cognitive technology that allows cognizers to offload some of their cognitive functions onto the brains of other cognizers. Language also extends cognizers' individual and joint performance powers, distributing the load through interactive and collaborative cognition. Reading, writing, print, telecommunications and computing further extend cognizers' capacities. And now the web, with its network of cognizers, digital databases and software agents, all accessible anytime, anywhere, has become our 'Cognitive Commons,' in which distributed cognizers and cognitive technology can interoperate globally with a speed, scope and degree of interactivity inconceivable through local individual cognition alone. And as with language, the cognitive tool par excellence, such technolo
This paper presents a general theory that aims at explaining the timescales observed empirically in technology transitions and predicting those of future transitions. This framework is then used to derive a theory for exploring the dynamics that underlie the complex phenomenon of irreversible and path-dependent price- or policy-induced efficiency changes. Technology transitions are known to follow patterns well described by logistic functions, which should more rigorously be modelled mathematically using the Lotka-Volterra family of differential equations, originally developed to describe the population growth of competing species. The dynamic evolution of technology has also been described theoretically using evolutionary dynamics similar to those observed in nature. The theory presented here joins both approaches and presents a methodology for predicting changeover time constants in order to describe real systems of competing technologies. The problem of price- or policy-induced efficiency changes is naturally explained within this framework. Examples of application are given.
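To make the modelling approach concrete, here is a minimal sketch of Lotka-Volterra competition between an incumbent and a challenger technology sharing one market niche; the parameter values are illustrative assumptions, not fitted constants from the paper.

```python
# Minimal sketch of Lotka-Volterra competition dynamics for a
# technology transition. Parameters are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

def competition(t, y, r1=0.5, r2=0.7, K=1.0, a12=1.2, a21=0.8):
    """Market shares y1 (incumbent) and y2 (challenger) compete for the
    same niche of size K; a12, a21 are competition coefficients. With
    a12 > 1 > a21, the challenger eventually displaces the incumbent."""
    y1, y2 = y
    dy1 = r1 * y1 * (1 - (y1 + a12 * y2) / K)
    dy2 = r2 * y2 * (1 - (y2 + a21 * y1) / K)
    return [dy1, dy2]

# Incumbent near saturation, challenger entering at 1% share.
sol = solve_ivp(competition, (0, 60), [0.9, 0.01], dense_output=True)
t = np.linspace(0, 60, 7)
for ti, (y1, y2) in zip(t, sol.sol(t).T):
    print(f"t={ti:4.0f}  incumbent={y1:.2f}  challenger={y2:.2f}")
```

The printed trajectory follows the familiar logistic changeover pattern, with the changeover time constant set by the growth rates and competition coefficients.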
This paper introduces a new technology for a high-speed, high-power mobile form-factor tuner utilizing gas discharge tube plasma cells as switching components. To the best of our knowledge, this represents the first plasma-enabled RF matching network. Technology development is reviewed, the fabrication and measurement of a proof-of-concept switched stub impedance tuner are presented, and techniques for improvement are discussed. The proof-of-concept impedance tuner functions with a 27% bandwidth from 3 to almost 4 GHz and shows a power gain better than -2.5 dB across all switching state-frequency combinations at a 50 W input power level with spread coverage of the Smith chart. State change transient timing is measured to be on the order of 500 ns. This technology demonstration highlights the potential of miniaturized, rapidly-tunable, high-power, plasma-based RF devices.
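The principle behind a switched-stub tuner can be sketched in a few lines: engaging a shunt stub moves the input impedance seen by the source, steering the reflection coefficient around the Smith chart. In the sketch below, the plasma cell is idealized as a perfect short when on and a perfect open when off, and the load, stub lengths, and frequency are hypothetical.

```python
# Minimal sketch of switched shunt-stub impedance steering, the
# principle behind the tuner above. Geometry and states are assumed.
import numpy as np

Z0 = 50.0                    # system impedance, ohms
f = 3.5e9                    # operating frequency, Hz (mid-band)
beta = 2 * np.pi * f / 3e8   # phase constant, assuming ~free-space velocity

def shorted_stub_z(length):
    """Input impedance of a short-circuited stub: j * Z0 * tan(beta * l)."""
    return 1j * Z0 * np.tan(beta * length)

def reflection(Zin):
    """Magnitude of the reflection coefficient relative to Z0."""
    return abs((Zin - Z0) / (Zin + Z0))

ZL = 25.0 + 30.0j            # hypothetical load to steer toward a match

print(f"all stubs off: |Gamma|={reflection(ZL):.2f}")
for stub_len in (0.005, 0.010, 0.015):   # candidate stub lengths, meters
    # Plasma cell ON -> stub grounded -> shunt reactance loads the line.
    Zstub = shorted_stub_z(stub_len)
    Zin = (ZL * Zstub) / (ZL + Zstub)    # shunt combination at the load
    print(f"stub {stub_len*1e3:4.1f} mm on: |Gamma|={reflection(Zin):.2f}")
```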
Wireless networking is growing rapidly and becoming an inexpensive technology that allows multiple users to simultaneously access the network and the internet while roaming about a campus. In the present work, the software development of a wireless LAN (WLAN) is highlighted. This WLAN utilizes direct sequence spread spectrum (DSSS) technology at a 902 MHz RF carrier frequency in its physical layer. The cost-effective installation and anti-jamming property of spread spectrum technology are the major advantages of this work.
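For illustration, the following minimal sketch shows DSSS spreading and despreading with a 7-chip Barker-like PN sequence, including the anti-jamming behaviour noted above; the chip sequence and jammer model are assumptions, not details from the paper.

```python
# Minimal sketch of direct sequence spread spectrum (DSSS): each data
# bit is multiplied by a pseudo-noise (PN) chip sequence, and the
# receiver despreads by correlating with the same sequence.
import numpy as np

pn = np.array([1, 1, 1, -1, -1, 1, -1])    # 7-chip PN code in {+1, -1}
data = np.array([1, -1, 1, 1])             # data bits in {+1, -1}

# Spread: every bit becomes a full chip sequence (bandwidth expansion).
tx = np.concatenate([bit * pn for bit in data])

# Channel: a narrowband jammer modeled as a constant offset plus noise.
rng = np.random.default_rng(0)
rx = tx + 0.8 + 0.3 * rng.standard_normal(tx.size)

# Despread: correlate each chip block with the PN sequence. The jammer
# energy is spread out while the signal adds coherently -- this is the
# anti-jamming property of DSSS.
blocks = rx.reshape(len(data), len(pn))
decided = np.sign(blocks @ pn)
print("recovered bits:", decided.astype(int))
```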
Over the past few years, we have seen several cyber incidents being reported where some of the primary causes were the lack of proper security controls onboard ships and of crew awareness of cybersecurity. In response to the growing cyber threat landscape in the maritime sector, we have developed a set of guidelines for maritime cyber risk management, focusing on four major shipboard Operational Technology (OT) systems that are crucial for the day-to-day operation of ships. These four OT systems are: Communication Systems; Propulsion, Machinery and Power Control Systems; Navigation Systems; and Cargo Management Systems. The guidelines identify the cyber risks in each of the OT systems and recommend the necessary actions that can be taken to manage risks in each shipboard OT system. In this paper, we introduce the new guidelines, which include cyber risks, mitigation measures, cyber risk assessment, and a checklist to help shipowners and maritime authorities assess and enhance the cyber hygiene of their vessels. Our guidelines have been disseminated by the Maritime and Port Authority of Singapore (MPA) to owners and operators of the Singapore Registry of Ships for their reference and use.
Twenty-five years ago, Joel Reidenberg argued that technology itself, not just law and regulation, imposes rules on communities in the Information Society. System design choices like network architecture and configurations create regulatory norms he termed "Lex Informatica," referencing the merchant-driven medieval "Lex Mercatoria" that emerged independent of sovereign control. Today we face different challenges requiring us to revisit Reidenberg's insights and examine the consequences of that earlier era. While Lex Informatica provided a framework for analyzing the internet's birth, we now confront the aftereffects of decades of minimal or absent regulation. Critical questions emerge: When technological social norms develop outside clear legal restraints, who benefits and who suffers? This new era demands infrastructural reform focused on the interplay between public and private regulation and self-regulation, weighing both costs and benefits. Rather than showcasing the promise of yesterday's internet age, today's events reveal the pitfalls of information libertarianism and underscore the urgent need for new approaches to information regulation. This Issue presents articles that take up these questions.