NS Network Science

Network science is an academic field that studies complex networks such as telecommunication networks, computer networks, biological networks, cognitive and semantic networks, and social networks, representing distinct elements or actors as nodes (or vertices) and the connections between them as links (or edges). The field draws on theories and methods including graph theory from mathematics, statistical mechanics from physics, data mining and information visualization from computer science, inferential modeling from statistics, and social structure from sociology. The United States National Research Council defines network science as “the study of network representations of physical, biological, and social phenomena leading to predictive models of these phenomena.” INS is actively involved in research in this field.
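The node-and-link representation described above can be sketched in a few lines. This is a minimal illustration with a hypothetical toy network (the node names and edges are invented for the example), computing node degree, one of the most basic network-science statistics:

```python
from collections import defaultdict

# Toy undirected network: nodes are elements or actors,
# edges are the links between them.
edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D")]

adjacency = defaultdict(set)
for u, v in edges:
    adjacency[u].add(v)
    adjacency[v].add(u)

# Degree (number of links) per node.
degrees = {node: len(neighbors) for node, neighbors in adjacency.items()}
print(degrees)  # {'A': 2, 'B': 2, 'C': 3, 'D': 1}
```

The same adjacency structure underlies most of the network statistics mentioned in this section (clustering, paths, centrality).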

Quantum Computing Networks

Quantum Search Algorithms
1 The Deutsch Algorithm
2 The Deutsch-Jozsa Algorithm
3 Simon’s Algorithm
4 Shor’s Algorithm
5 Quantum Phase Estimation Algorithm
6 Grover’s Quantum Search Algorithm
7 Boyer-Brassard-Høyer-Tapp Quantum Search Algorithm
8 Dürr-Høyer Quantum Search Algorithm
9 Quantum Counting Algorithm
10 Quantum Heuristic Algorithm
11 Quantum Genetic Algorithm
12 Harrow-Hassidim-Lloyd Algorithm
13 Quantum Mean Algorithm
14 Quantum Weighted Sum Algorithm
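Grover’s quantum search algorithm (item 6 above) can be simulated classically on a small statevector. This sketch is not a quantum implementation; it only mimics the amplitude arithmetic (oracle sign flip plus reflection about the mean) to show why roughly π/4·√N iterations concentrate probability on the marked item. The problem size and marked index are arbitrary example values:

```python
import math

def grover(n_items: int, marked: int, iterations: int) -> list[float]:
    """Classical statevector simulation of Grover search over n_items entries."""
    # Start in the uniform superposition.
    amp = [1.0 / math.sqrt(n_items)] * n_items
    for _ in range(iterations):
        # Oracle: flip the sign of the marked item's amplitude.
        amp[marked] = -amp[marked]
        # Diffusion operator: reflect every amplitude about the mean.
        mean = sum(amp) / n_items
        amp = [2 * mean - a for a in amp]
    return [a * a for a in amp]  # measurement probabilities

n = 8
k = math.floor(math.pi / 4 * math.sqrt(n))  # ~O(sqrt(N)) iterations
probs = grover(n, marked=5, iterations=k)
print(f"P(marked) after {k} iterations: {probs[5]:.3f}")
```

For N = 8 the optimal count is 2 iterations, after which the marked item is measured with probability above 0.9, versus 1/8 for classical random guessing.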


AI Artificial Intelligence in Networks


Artificial intelligence (AI, also machine intelligence, MI) is intelligent behavior exhibited by machines, in contrast to the natural intelligence (NI) of humans and other animals. In computer science, AI research is defined as the study of “intelligent agents”: any device that perceives its environment and takes actions that maximize its chance of success at some goal. Colloquially, the term “artificial intelligence” is applied when a machine mimics “cognitive” functions that humans associate with human minds, such as “learning” and “problem solving”. Capabilities generally classified as AI as of 2017 include successfully understanding human speech, competing at a high level in strategic game systems, autonomous cars, intelligent routing in content delivery networks, military simulations, and interpreting complex data. INS’s interest is in the field of AI applications in network control and design.
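The “intelligent agent” definition above (perceive the environment, act to maximize progress toward a goal) can be sketched as a minimal perceive-decide-act loop. The environment here is a hypothetical toy invented for the example: a position on a number line with a goal state, and a greedy policy over two actions:

```python
# Minimal intelligent-agent loop: perceive, choose the action that best
# advances the goal, act. Toy fully observable environment: reach GOAL.
GOAL = 10

def perceive(state: int) -> int:
    return state  # the environment is fully observable in this toy example

def choose_action(percept: int) -> int:
    # Greedy policy: pick the step (+1 or -1) that minimizes distance to GOAL.
    return min((+1, -1), key=lambda a: abs(percept + a - GOAL))

state = 0
for _ in range(20):
    if state == GOAL:
        break
    state += choose_action(perceive(state))

print(state)  # reaches 10
```

Real agents differ in the richness of the percepts and the policy (learned rather than hand-coded), but the loop structure is the same.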

Big data is data sets that are so voluminous and complex that traditional data-processing application software is inadequate to deal with them. Big data challenges include capturing data, data storage, data analysis, search, sharing, transfer, visualization, querying, updating and information privacy. There are three dimensions to big data, known as Volume, Variety and Velocity. Lately, the term “big data” tends to refer to the use of predictive analytics, user behavior analytics, or certain other advanced data analytics methods that extract value from data, and seldom to a particular size of data set. “There is little doubt that the quantities of data now available are indeed large, but that’s not the most relevant characteristic of this new data ecosystem.” Analysis of data sets can find new correlations to “spot business trends, prevent diseases, combat crime and so on.” Scientists, business executives, practitioners of medicine, advertising and governments alike regularly meet difficulties with large data sets in areas including Internet search, urban informatics, and business informatics. Scientists encounter limitations in e-Science work, including meteorology, genomics, connectomics, complex physics simulations, biology and environmental research.

Data sets grow rapidly – in part because they are increasingly gathered by cheap and numerous information-sensing Internet of things devices such as mobile devices, aerial (remote sensing) platforms, software logs, cameras, microphones, radio-frequency identification (RFID) readers and wireless sensor networks. The world’s technological per-capita capacity to store information has roughly doubled every 40 months since the 1980s; as of 2012, every day 2.5 exabytes (2.5×10^18 bytes) of data were generated. By 2025, IDC predicts there will be 163 zettabytes of data. One question for large enterprises is determining who should own big-data initiatives that affect the entire organization.

Relational database management systems and desktop statistics and visualization packages often have difficulty handling big data. The work may require “massively parallel software running on tens, hundreds, or even thousands of servers”. What counts as “big data” varies depending on the capabilities of the users and their tools, and expanding capabilities make big data a moving target. “For some organizations, facing hundreds of gigabytes of data for the first time may trigger a need to reconsider data management options. For others, it may take tens or hundreds of terabytes before data size becomes a significant consideration.”
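The “massively parallel” pattern behind such software can be shown in miniature as a map-reduce word count. This sketch runs sequentially; the documents and counts are invented example data, and the point is the structure: the map step over each chunk is independent, which is exactly what lets it be distributed across many servers:

```python
from collections import Counter
from functools import reduce

# Split the data, map a local computation over each chunk independently
# (the step that would be distributed across servers), then reduce.
docs = [
    "big data needs parallel processing",
    "parallel software on many servers",
    "big data big challenges",
]

def map_chunk(chunk: str) -> Counter:
    return Counter(chunk.split())        # local word count per chunk

def reduce_counts(a: Counter, b: Counter) -> Counter:
    return a + b                         # merge partial counts

partials = [map_chunk(d) for d in docs]  # embarrassingly parallel step
totals = reduce(reduce_counts, partials)
print(totals["big"], totals["data"])     # 3 2
```

Frameworks such as Hadoop and Spark industrialize this same split/map/reduce shape with fault tolerance and data locality on top.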

INS has developed a number of offline learning algorithms using big data.

Bio Networks

A biological network is any network that applies to biological systems. A network is any system with sub-units that are linked into a whole, such as species units linked into a whole food web. Biological networks provide a mathematical representation of connections found in ecological, evolutionary, and physiological studies, such as neural networks. The analysis of biological networks with respect to human diseases has led to the field of network medicine. INS is applying the results and methodologies from this field to communication networks and the Internet.

T networks (transportation)

In graph theory, a flow network (also known as a transportation network) is a directed graph where each edge has a capacity and each edge receives a flow. The amount of flow on an edge cannot exceed the capacity of the edge. Often in operations research, a directed graph is called a network, the vertices are called nodes and the edges are called arcs. A flow must satisfy the restriction that the amount of flow into a node equals the amount of flow out of it, unless it is a source, which has only outgoing flow, or a sink, which has only incoming flow. A network can be used to model traffic in a road system, circulation with demands, fluids in pipes, currents in an electrical circuit, or anything similar in which something travels through a network of nodes. INS is applying analogies and results from communication networks design and analysis to transportation networks.
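The capacity and conservation constraints above lead directly to the maximum-flow problem. This is a compact sketch of the Edmonds-Karp algorithm (BFS-based Ford-Fulkerson); the road network at the bottom is a hypothetical example with capacities in vehicles per minute:

```python
from collections import deque

def max_flow(capacity: dict, source: str, sink: str) -> int:
    """Edmonds-Karp: repeatedly push flow along shortest augmenting paths."""
    # Residual capacities, with reverse edges initialised to 0.
    residual = {u: dict(nbrs) for u, nbrs in capacity.items()}
    for u, nbrs in capacity.items():
        for v in nbrs:
            residual.setdefault(v, {}).setdefault(u, 0)
    flow = 0
    while True:
        # BFS for an augmenting path with spare residual capacity.
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, cap in residual[u].items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow  # no augmenting path left: flow is maximal
        # Find the bottleneck on the path, then update residual capacities.
        path, v = [], sink
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(residual[u][v] for u, v in path)
        for u, v in path:
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
        flow += bottleneck

# Hypothetical road network: capacities in vehicles per minute.
roads = {"s": {"a": 10, "b": 5}, "a": {"t": 8}, "b": {"t": 7}}
print(max_flow(roads, "s", "t"))  # 13
```

By the max-flow min-cut theorem the returned value also identifies the bottleneck cut of the network, which is what makes this analysis useful for both communication and transportation networks.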

S networks (social)

A social networking service (also social networking site, SNS or social media) is an online platform which people use to build social networks or social relations with other people who share similar personal or career interests, activities, backgrounds or real-life connections. Most social-network services are web-based and provide means for users to interact over the Internet, such as by e-mail, by instant messaging and through online forums. INS is applying analogies and results from social networks design and analysis to communication networks.

E networks (energy networks)

An electrical grid is an interconnected network for delivering electricity from producers to consumers. It consists of generating stations that produce electrical power, high-voltage transmission lines that carry power from distant sources to demand centers, and distribution lines that connect individual customers. Electrical grids vary in size from a single building, through national grids that cover whole countries, to transnational grids that can cross continents. INS develops algorithms for electric grid stability control, including pricing mechanisms.

C networks (communications)

INS is doing research in the fields of: Network Sciences, Network Optimization, Topology Design, SDN and Network Virtualization, Network Slicing, Routing, Network Economics, Spectrum Sharing, Business Models in Networking, Artificial Intelligence in Networking, Network Stability Control, Cognitive Networks, Network Security, Blockchain Technology, IoT, and Low Latency Networks.


Satellite Networks

DTN concept in space networks

Each node of the delay-tolerant networking (DTN) architecture can store information for a long time before forwarding it. Thanks to these features, a DTN is particularly suited to cope with the challenges imposed by space communication over networks with intermittent connectivity (visibility) and random interruptions between the nodes. The DTN concept lies in a generalization of requirements identified for interplanetary networking (IPN), where latencies that may reach the order of tens of minutes, as well as limited and highly asymmetric bandwidth, must be faced. Delays and disruptions can be handled at each DTN hop in a path between a sender and a destination. Nodes on the path can provide the storage necessary for data in transit before forwarding it to the next node on the path. In consequence, the contemporaneous end-to-end connectivity that the Transmission Control Protocol (TCP) and other standard Internet transport protocols require in order to reliably transfer application data is not needed. In practice, in standard TCP/IP networks, which assume continuous connectivity and short delays, routers perform non-persistent (short-term) storage and information is persistently stored only at end nodes. In DTNs, information is persistently (long-term) stored at intermediate DTN nodes. This makes DTNs much more robust against disruptions, disconnections, and node failures.
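The store-and-forward behavior described above can be illustrated with a toy discrete-time simulation. The two-hop path and the visibility windows here are invented example values; the point is that the bundle waits in persistent storage at each node until its next-hop link becomes visible:

```python
# DTN store-and-forward sketch (hypothetical two-hop path A -> B -> C):
# each node holds the bundle persistently until its next-hop link is
# visible. Link visibility windows are given as sets of time slots.
visibility = {
    ("A", "B"): {2, 3},   # A can see B only at t = 2 and t = 3
    ("B", "C"): {6, 7},   # B can see C only at t = 6 and t = 7
}
path = ["A", "B", "C"]

location = "A"            # where the bundle is currently stored
hop = 0
log = []
for t in range(10):
    if hop < len(path) - 1 and t in visibility[(path[hop], path[hop + 1])]:
        hop += 1          # forward one hop during the contact window
        location = path[hop]
        log.append((t, location))

print(log)  # [(2, 'B'), (6, 'C')]
```

Note that no moment exists at which A, B, and C are simultaneously connected; the bundle still reaches C, which is exactly the property that end-to-end transports like TCP cannot provide here.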

INS provides designs for satellite constellation optimization that minimize latency in the network with a minimum number of satellites in orbit. In addition to providing the most economical solutions for minimum latency in the network, this also slows down the build-up of space debris, which has already been recognized as a problem for the future exploration of space.

It is time to take responsibility within our own profession for the future state of space, rather than to wait for warnings from Hollywood or from future Green Space movements and parties.

IPN with zero waiting-for-visibility (WFV) latency


Bitcoin is a worldwide cryptocurrency and digital payment system called the first decentralized digital currency, as the system works without a central repository or single administrator. It was invented by an unknown programmer, or a group of programmers, under the name Satoshi Nakamoto and released as open-source software in 2009. The system is peer-to-peer, and transactions take place between users directly, without an intermediary. These transactions are verified by network nodes and recorded in a public distributed ledger called a blockchain.

Blockchain is a public ledger that records bitcoin transactions. A novel solution accomplishes this without any trusted central authority: the maintenance of the blockchain is performed by a network of communicating nodes running bitcoin software. Transactions of the form payer X sends Y bitcoins to payee Z are broadcast to this network using readily available software applications. Network nodes can validate transactions, add them to their copy of the ledger, and then broadcast these ledger additions to other nodes. The blockchain is a distributed database – to achieve independent verification of the chain of ownership of any and every bitcoin amount, each network node stores its own copy of the blockchain.
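The chain-of-ownership verification described above rests on each block committing to the hash of its predecessor. This is a minimal sketch of that hash-chain idea only (no proof-of-work, no networking; the transactions are invented examples), showing why tampering with history is detectable by any node holding its own copy:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Deterministic serialization, then SHA-256.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

# Build a tiny chain: each block commits to the previous block's hash,
# so altering any earlier transaction breaks every later link.
GENESIS = "0" * 64
chain = []
prev = GENESIS
for tx in ["X pays Z 2", "Z pays W 1"]:
    block = {"prev": prev, "tx": tx}
    chain.append(block)
    prev = block_hash(block)

def valid(chain: list) -> bool:
    prev = GENESIS
    for block in chain:
        if block["prev"] != prev:
            return False  # link to predecessor is broken
        prev = block_hash(block)
    return True

print(valid(chain))              # True
chain[0]["tx"] = "X pays Z 200"  # tamper with history
print(valid(chain))              # False
```

In Bitcoin the same linkage is combined with proof-of-work, so rewriting an old block also requires redoing the work of every later block, which is what makes the distributed ledger practically immutable.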

Double-spending is an error in a digital cash scheme in which the same single digital token is spent more than once. This is possible because a digital token consists of a digital file that can be duplicated or falsified. As with counterfeit money, such double-spending leads to inflation by creating a new amount of fraudulent currency that did not previously exist. This devalues the currency relative to other monetary units, and diminishes user trust as well as the circulation and retention of the currency. Fundamental cryptographic techniques to prevent double-spending while preserving anonymity in a transaction are blind signatures and, particularly in offline systems, secret splitting.

INS provides special network design protocols for early warning of double-spending attempts.
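At its core, detecting a double-spend is a bookkeeping problem: a validating node must remember which tokens have already been used. This sketch shows only that core check with a hypothetical token identifier; real systems (e.g. Bitcoin's unspent-output set) apply the same idea to cryptographically signed transactions:

```python
# Minimal double-spend check, as a validating node might perform it:
# keep the set of already-spent token IDs and reject any token seen twice.
spent = set()

def validate(token_id: str) -> bool:
    if token_id in spent:
        return False      # double-spend attempt: token already used
    spent.add(token_id)
    return True

print(validate("token-42"))  # True  (first spend accepted)
print(validate("token-42"))  # False (second spend rejected)
```

The hard part in a decentralized setting is that every node must converge on the same `spent` set without a central authority, which is precisely what the blockchain consensus mechanism provides.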
