Applied Computing For Software And Smart System...


Although first proposed in 1956,[32] the term "computer science" appears in a 1959 article in Communications of the ACM,[33] in which Louis Fein argues for the creation of a Graduate School in Computer Sciences analogous to the creation of Harvard Business School in 1921.[34] Fein justifies the name by arguing that, like management science, the subject is applied and interdisciplinary in nature, while having the characteristics typical of an academic discipline.[33] His efforts, and those of others such as numerical analyst George Forsythe, were rewarded: universities went on to create such departments, starting with Purdue in 1962.[35] Despite its name, a significant amount of computer science does not involve the study of computers themselves. Because of this, several alternative names have been proposed.[36] Certain departments of major universities prefer the term computing science, to emphasize precisely that difference. Danish scientist Peter Naur suggested the term datalogy,[37] to reflect the fact that the scientific discipline revolves around data and data treatment, while not necessarily involving computers. The first scientific institution to use the term was the Department of Datalogy at the University of Copenhagen, founded in 1969, with Peter Naur being the first professor in datalogy. The term is used mainly in the Scandinavian countries. An alternative term, also proposed by Naur, is data science; this is now used for a multi-disciplinary field of data analysis, including statistics and databases.







A number of computer scientists have argued for the distinction of three separate paradigms in computer science. Peter Wegner argued that those paradigms are science, technology, and mathematics.[48] Peter Denning's working group argued that they are theory, abstraction (modeling), and design.[49] Amnon H. Eden described them as the "rationalist paradigm" (which treats computer science as a branch of mathematics, which is prevalent in theoretical computer science, and mainly employs deductive reasoning), the "technocratic paradigm" (which might be found in engineering approaches, most prominently in software engineering), and the "scientific paradigm" (which approaches computer-related artifacts from the empirical perspective of natural sciences,[50] identifiable in some branches of artificial intelligence).[51] Computer science focuses on methods involved in design, specification, programming, verification, implementation and testing of human-made computing systems.[52]


Scientific computing (or computational science) is the field of study concerned with constructing mathematical models and quantitative analysis techniques and using computers to analyze and solve scientific problems. A major usage of scientific computing is simulation of various processes, including computational fluid dynamics, physical, electrical, and electronic systems and circuits, as well as societies and social situations (notably war games) along with their habitats, among many others. Modern computers enable optimization of such designs as complete aircraft. Notable in electrical and electronic circuit design are SPICE,[60] as well as software for physical realization of new (or modified) designs. The latter includes essential design software for integrated circuits.[61]
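As a small, self-contained illustration of this kind of numerical simulation, the sketch below integrates a single RC (resistor-capacitor) circuit with the forward Euler method, a much simpler relative of what SPICE-style simulators do far more robustly; the component values, step size, and input are arbitrary assumptions chosen for the example, not values taken from the text.

```python
# Minimal sketch: forward-Euler simulation of an RC low-pass circuit, the
# kind of lumped-element model a SPICE-style simulator solves far more
# robustly. All component values and the step size are illustrative
# assumptions, not values from the text.
import math

R = 1_000.0    # resistance in ohms
C = 1e-6       # capacitance in farads
V_IN = 5.0     # step input voltage in volts
DT = 1e-5      # integration step in seconds
T_END = 10e-3  # total simulated time in seconds


def simulate_rc(v0=0.0):
    """Return (times, capacitor voltages) for a step input applied at t = 0."""
    times, voltages = [0.0], [v0]
    v, t = v0, 0.0
    while t < T_END:
        # Kirchhoff's laws for the series RC loop give dv/dt = (V_in - v) / (R * C).
        v += DT * (V_IN - v) / (R * C)
        t += DT
        times.append(t)
        voltages.append(v)
    return times, voltages


if __name__ == "__main__":
    ts, vs = simulate_rc()
    # After one time constant (R * C = 1 ms) the capacitor should sit near
    # 63% of the input voltage; compare the simulation with the analytic value.
    i = min(range(len(ts)), key=lambda k: abs(ts[k] - R * C))
    print(f"simulated v(RC) = {vs[i]:.3f} V, analytic = {V_IN * (1 - math.exp(-1)):.3f} V")
```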


Artificial intelligence (AI) aims to synthesize the goal-oriented processes, such as problem-solving, decision-making, environmental adaptation, learning, and communication, found in humans and animals. From its origins in cybernetics and in the Dartmouth Conference (1956), artificial intelligence research has been necessarily cross-disciplinary, drawing on areas of expertise such as applied mathematics, symbolic logic, semiotics, electrical engineering, philosophy of mind, neurophysiology, and social intelligence. AI is associated in the popular mind with robotic development, but the main field of practical application has been as an embedded component in areas of software development that require computational understanding. The starting point in the late 1940s was Alan Turing's question "Can computers think?", and the question remains effectively unanswered, although the Turing test is still used to assess computer output on the scale of human intelligence. But the automation of evaluative and predictive tasks has been increasingly successful as a substitute for human monitoring and intervention in domains of computer application involving complex real-world data.


Applied computing addresses real-world problems through the application of computing technologies. It spans both theoretical and applied computer science, from hardware to software, and is a multidisciplinary field that integrates computer and information technology with a second discipline. Machine intelligence is one of the most important and promising technologies for applied computing: it enables computers to learn from data using a variety of techniques across many applications.
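As a minimal sketch of "learning from data" in an applied-computing setting, the example below trains a small classifier on scikit-learn's bundled Iris dataset; the dataset, model choice, and evaluation split are assumptions made purely for illustration and are not drawn from the text above.

```python
# Minimal sketch of supervised learning from data, assuming scikit-learn is
# installed; the dataset and model choice are illustrative, not taken from
# the text above.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Load a small labelled dataset (150 flower samples, 4 numeric features).
X, y = load_iris(return_X_y=True)

# Hold out a test set so the evaluation reflects unseen data.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

# Fit a simple linear classifier: this is the "learning from data" step.
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# Evaluate the learned model on the held-out samples.
print(f"held-out accuracy: {accuracy_score(y_test, model.predict(X_test)):.2f}")
```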


This Topic Issue (TI) focuses on the latest ideas, solutions, and developments in applied computing using machine intelligence. The topics of interest include all kinds of computing applications that use machine/artificial intelligence techniques, as well as theoretical work in these areas. Related areas of computation and technology include information management, systems, networking, programming, software engineering, mobile technology, graphics applications and visualization, data integration, security, and artificial intelligence. Because this work sits at the intersection of multiple disciplines, it has a wide range of applications, including finance, retail, education, healthcare, agriculture, navigation, lifestyle, and manufacturing.


Mitropoulos, S. and Douligeris, C. (2021), "Why and how informatics and applied computing can still create structural changes and competitive advantage", Applied Computing and Informatics, Vol. ahead-of-print No. ahead-of-print. -06-2021-0149


Nevertheless, there are many enterprises which, despite investing significant money and human resources in applied computing and in relevant research, have failed to realize the initially expected advantages. This happened because these enterprises failed to adopt, and then successfully implement, the new e-business and e-governance models [2]. In trying to respond rapidly to market needs and to fierce competition, many enterprises have pursued only short-term benefits from applied computing and informatics, aiming solely to accelerate the development of new services and products at the lowest possible operating cost. But ignoring the fundamentals of strategy formulation has led to a convergence of business practices based on cost leadership [3], something we have seen happen in the past with the dotcom enterprises. This problem persists with the latest informatics technologies, where enterprises still do not incorporate informatics and computing applications into their processes in the right way [2, 4, 5]. The main faults that enterprises make in adopting informatics are that they do not: synchronize their IT strategy with the goals of their business strategy,


The exponential growth of ubiquitous computing drives the need for new business models, which must serve an effective IT-enabled business strategy directly related to digital transformation. The problem space here is how the effective adoption of new IT technologies can be achieved, which in turn drives the requirements for a renewed business strategy, renewed processes, and culture change [29]. The main problem modern enterprises face concerns the right IT implementation, along with the adoption of the right strategy, processes, and culture. This paper tries to answer all these requirements in a consistent and methodical way. Even though a considerable body of research in this direction exists, there is still a gap in developing a close alignment between the strategy that enterprises must follow and an intelligent, effective way of implementing new technologies so that they create a competitive advantage [5]. The Internet, cloud computing, mobile computing, service-oriented architectures (SOA), the Internet of Things (IoT), blockchain, server virtualization and other modern technologies give enterprises the opportunity to migrate from traditional business models to new ones [30, 31, 35, 36].


Many researchers argue that IT no longer offers any innovation advantage over the competition and that, therefore, IT has reached the stage of maturing into a utility service [6]. This viewpoint, however, has been expressed without considering IT's rapid and disruptive evolution. There is obviously a part of IT that is common to almost all enterprises, making this utility approach workable. Nevertheless, this is only one side of the coin, because IT keeps creating new situations that accelerate developments in the operation of markets and enterprises. The concept of ubiquitous computing, for example, is a new trend that will surely cause structural changes, while new ideas in virtual communities can create significant changes in corporate collaboration by forming on-demand virtual business-partner hyperarchies. In addition, new intelligent algorithms for machine learning, artificial intelligence and biotechnology are being developed, facilitating research into new products (e.g. drugs and crystalline solid materials) and services (e.g. telemedicine) with significant business and social benefits [7]. Thus, the new information technologies can leverage the capacity for innovation. Through the smart and targeted use of IT, enterprises will be able to create growth and deliver innovative products and services at the right time.


Service grids will be enhanced with the new ubiquitous computing capabilities. For example, mobile computing can effectively improve the quality of service, as well as the collaboration between employers, employees, customers, strategic partners, and third parties [25]. Furthermore, smartphones can provide a variety of functionalities, such as user interaction, task management, online user help, blogging, wikis, chat, and remote access for trusted parties or customers through appropriate authentication and authorization, that can significantly improve the operational effectiveness of modern enterprises. The dissemination of notifications in mobile apps (e.g. via Google Cloud Messaging) is another example that can facilitate business operations.
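As a hedged sketch of the notification dissemination mentioned above, the example below sends a push message through the Firebase Admin SDK for Firebase Cloud Messaging, the successor to Google Cloud Messaging; the service-account file name and device registration token are hypothetical placeholders, not values from the source.

```python
# Minimal sketch: sending a push notification to a single device via
# Firebase Cloud Messaging (the successor to Google Cloud Messaging).
# The credential path and device token below are hypothetical placeholders;
# real values come from a Firebase project and the client app.
import firebase_admin
from firebase_admin import credentials, messaging

# Authenticate the server with a service-account key (hypothetical path).
cred = credentials.Certificate("service-account.json")
firebase_admin.initialize_app(cred)


def notify_device(device_token: str, title: str, body: str) -> str:
    """Send one notification to a device and return the FCM message ID."""
    message = messaging.Message(
        notification=messaging.Notification(title=title, body=body),
        token=device_token,
    )
    return messaging.send(message)


if __name__ == "__main__":
    msg_id = notify_device(
        device_token="EXAMPLE_DEVICE_REGISTRATION_TOKEN",  # hypothetical token
        title="Order update",
        body="Your support ticket has been assigned to an engineer.",
    )
    print("sent:", msg_id)
```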

