Day 1:
- Wireless and Telecommunication
University of Duisburg-Essen, Germany
Mohammed Bawatna received an M.Sc. in Electronic and Communication Engineering (2015) from Al-Quds University in East Jerusalem (Palestine). In 2016, he began his Ph.D. at the Central Institute of Engineering, Electronics and Analytics (ZEA-2) of Forschungszentrum Jülich GmbH in Germany. The Ph.D. is supervised by Prof. van Waasen and performed within the BMBF-funded project ‘Characterization, monitoring and modelling of landslide-prone hillslopes (CMM-SLIDE)’. In cooperation with partners from IBG-3, Forschungszentrum Jülich GmbH, a near real-time monitoring system to improve early warning of slope instability is being developed. His primary research interests include networking and wireless communication.
Recently, many applications that require Wireless Underground Sensor Network (WUSN) technology have emerged, such as automatic smart irrigation systems and the monitoring of underground water and gas pipelines. Creating a robust WUSN in a heterogeneous soil medium is one of the most difficult challenges facing both researchers and developers, due to extreme path loss, reduced propagation velocity, reflection, scattering, noise, and multipath fading.
The main objective of this research is to monitor landslide-prone hillslopes in hostile environments based on WUSN technology and the ZEA-2 intelligent Network Operating Device (iNODE) system. We therefore focus on the efficiency and quality of wireless underground channels in sub-GHz RF bands, in particular the total attenuation due to reflection and propagation effects in dry and wet soil. Development at the physical and data-link layers requires measuring the Power Delay Profile (PDP) and investigating the problem of fluctuation; testbed measurements were therefore taken in the Electromagnetic Compatibility (EMC) laboratory for improved accuracy. Validation of the experimental channel measurements against the numerical estimation of the Bogena et al. model shows good agreement. In addition, we include the numerical estimation for underground-to-underground channels and the effect of antenna gain. The results we present serve not only as guidelines for the redesign of more reliable matching filters, antenna selection and required transmit power, but also pave the way for future studies in underground beamforming and antenna diversity.
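The scale of underground attenuation in dry versus wet soil can be illustrated with the modified-Friis path-loss model widely used for WUSN channels. This is a generic sketch, not the Bogena et al. model itself, and the permittivity values below are illustrative placeholders rather than measured soil parameters:

```python
import math

def soil_attenuation(freq_hz, d_m, eps_r, eps_i):
    """Underground path loss (dB) from the standard modified-Friis WUSN
    model: L = 6.4 + 20*log10(d) + 20*log10(beta) + 8.69*alpha*d, where
    alpha/beta are derived from the soil's complex relative permittivity
    eps_r - j*eps_i (illustrative values, not measured)."""
    mu0 = 4e-7 * math.pi       # vacuum permeability
    eps0 = 8.854e-12           # vacuum permittivity
    w = 2 * math.pi * freq_hz
    ratio = eps_i / eps_r
    common = mu0 * eps_r * eps0 / 2
    alpha = w * math.sqrt(common * (math.sqrt(1 + ratio**2) - 1))  # Np/m
    beta = w * math.sqrt(common * (math.sqrt(1 + ratio**2) + 1))   # rad/m
    return 6.4 + 20 * math.log10(d_m) + 20 * math.log10(beta) + 8.69 * alpha * d_m

# Wet soil (higher permittivity and loss) attenuates far more than dry
# soil over the same 1 m link at 433 MHz:
dry = soil_attenuation(433e6, 1.0, 4.0, 0.2)
wet = soil_attenuation(433e6, 1.0, 20.0, 3.0)
```

The jump of tens of dB from dry to wet soil is why the abstract's dry/wet channel characterization matters for link budgeting.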
Naoki Wakamiya received his Ph.D. from Osaka University in 1996. He is now a Professor at the Graduate School of Information Science and Technology, Osaka University. His research interests include Bio-ICT (Biologically-inspired Information and Communication Technology) and self-organizing network control. He received the 2nd IEEE ComSoc Asia-Pacific Young Researcher Award in 2005.
Wireless sensor networks are now widely adopted in various application fields as one of the fundamental technologies of IoT. However, energy consumption, device cost, and management cost remain impediments to the deployment and long-term operation of a considerable number of sensors. To solve this problem, we propose a novel wireless sensor network architecture inspired by communication and computation in the brain. A neural network consists of very simple devices, i.e. neurons. Neurons only receive and emit series of impulses, or spike trains, over a randomly connected network. Despite this simplicity, communication and computation in the brain are known to be effective, adaptive, and robust. Our impulse-based WSN performs all of sensing, signal processing, and communication using impulses. A sensor node is equipped with a simple binary sensor, which generates impulses when an observed value exceeds a certain threshold. Impulses from the sensor and from neighbor nodes are then processed by a simple circuit, as in a living neuron, and the node broadcasts impulse signals to its neighbors. In our impulse-based WSN, nodes have no identifiers, and there is no energy-consuming control such as topology management and routing. Nevertheless, by using brain-inspired algorithms, we can derive information about an event observed by the sensors, i.e. when, where, and what happens. We introduce our impulse-based WSN, which enables easy deployment and long-term monitoring with high spatial resolution, and show some example applications.
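The node behaviour described above (binary sensor, neuron-like integration, identifier-free broadcast) can be sketched with a toy leaky integrate-and-fire model. The ImpulseNode class and all parameters are illustrative, not the authors' implementation:

```python
class ImpulseNode:
    """Toy leaky integrate-and-fire sensor node. A binary sensor injects
    an impulse when its reading crosses a threshold; impulses from
    neighbors are accumulated into a leaky 'membrane' value; the node
    emits its own impulse when that value exceeds a firing threshold and
    then resets. No node IDs, no routing state."""

    def __init__(self, leak=0.9, fire_threshold=2.0, sensor_threshold=30.0):
        self.leak = leak
        self.fire_threshold = fire_threshold
        self.sensor_threshold = sensor_threshold
        self.membrane = 0.0

    def step(self, sensor_value, neighbor_impulses):
        self.membrane *= self.leak                 # membrane leak
        if sensor_value > self.sensor_threshold:   # binary sensor impulse
            self.membrane += 1.0
        self.membrane += sum(neighbor_impulses)    # integrate neighbors
        if self.membrane >= self.fire_threshold:   # fire and reset
            self.membrane = 0.0
            return 1
        return 0

node = ImpulseNode()
# Quiet environment, no neighbor impulses: the node stays silent.
silent = [node.step(10.0, []) for _ in range(5)]
# Strong local reading plus two neighbor impulses: the node fires.
fired = node.step(50.0, [1, 1])
```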
Ariel University, ISRAEL
Prof. Yosef Pinhasi is the Dean of the Faculty of Engineering at Ariel University. He was born on May 3, 1961, and received the B.Sc., M.Sc. and Ph.D. degrees in electrical engineering from Tel-Aviv University, Israel, in 1983, 1989 and 1995, respectively. Prof. Pinhasi investigates the generation and utilization of electromagnetic waves across a wide range of frequencies, for applications such as communications, remote sensing and imaging. The space-frequency approach, which he developed, is employed to study the propagation of wide-band signals in absorptive and dispersive media in broadband communication links and wireless indoor and outdoor networks, as well as in remote-sensing radars and radiative power beaming operating in the millimeter-wave and terahertz regimes. Prof. Pinhasi is an active radio amateur (call sign: 4Z1VC).
The demand for broadband wireless communication links and the lack of open bands in the ‘conventional’ electromagnetic spectrum motivate the search for bandwidth at Extremely High Frequencies (EHF), above 30 GHz. Realization of wireless links in the millimeter-wave regime offers many advantages for communications and radar systems, including broad bandwidths for high-data-rate information transfer, high longitudinal and spatial resolution, and small aperture antennas and equipment size. However, it faces several challenges due to the low power of available solid-state transmitters, the low sensitivity of receivers, and the absorptive and dispersive effects that emerge during millimeter-wave propagation through the atmosphere. When millimeter-wave radiation passes through the atmosphere, it suffers selective molecular absorption, mainly by oxygen and water vapor. The availability and reliability of terrestrial wireless links at EHF are determined by weather conditions such as temperature, pressure and humidity. Attenuation due to fog, haze, clouds, rain and snow is one of the dominant causes of fading.
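The scale of these losses can be illustrated with the Friis free-space formula plus a specific-attenuation term for the atmosphere. The specific-attenuation values below are rough illustrative figures; real gas and rain attenuation comes from ITU-R models such as P.676 and P.838:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def fspl_db(freq_hz, dist_m):
    """Free-space path loss (Friis): 20*log10(4*pi*d*f/c)."""
    return 20 * math.log10(4 * math.pi * dist_m * freq_hz / C)

def link_loss_db(freq_hz, dist_m, specific_atten_db_per_km):
    """Total loss = free-space loss + atmospheric specific attenuation
    accumulated over the path length."""
    return fspl_db(freq_hz, dist_m) + specific_atten_db_per_km * dist_m / 1e3

# The ~60 GHz oxygen absorption peak (~15 dB/km, illustrative) adds
# heavily to the already large free-space loss over a 1 km link,
# compared with operation near an atmospheric transmission window:
loss_60 = link_loss_db(60e9, 1000, 15.0)   # near the oxygen peak
loss_35 = link_loss_db(35e9, 1000, 0.1)    # near a window
```

This gap between absorption peaks and windows is exactly what the talk's "peaks and dips in atmospheric EHF transmission" refers to.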
The unique features of millimeter-wave communications are discussed, including a study of wireless links operating in the vicinity of peaks and dips in atmospheric EHF transmission. The utilization of small peak-to-average power ratio modulation techniques, such as constant-envelope waveforms, and algorithms for compensating group-delay distortions in the received signal are presented. Theoretical analysis and experimental results of wideband indoor and outdoor millimeter-wave transmissions are shown.
University of Cambridge, UK
Nikita Hari is a doctoral scholar in Electrical Engineering at the University of Cambridge, UK. She was a gold medallist for both her Bachelor's in Electronics and Instrumentation from CUSAT University (2003) and her Master's in Power Electronics from SRM University, Chennai, India (2011). She worked as a lecturer at the National Institute of Technology, Calicut, and then as a Visiting Research Associate at the Indian Institute of Technology, Delhi.
She currently supervises the Electrical Engineering course at the Engineering Department of the University of Cambridge for Churchill College. She is also the co-founder of Favalley, serves as the Chairperson of the EPSRC Centre for Power Electronics-UK, Secretary of IEEE, a steering committee member of Camawise, an advisory board member for Xter Tech Labs, and an educational consultant at iQdemic. Her current research interests include GaN power electronics and custom power electronic devices. She has published in international journals and conferences and is a prominent invited speaker at STEM and science outreach events in the UK and India.
The world deals day in and day out with electrical power: billions of kWh are delivered from wall outlets to power our electronic devices, all with the help of the ‘power cord’. The world is becoming smarter every day, and the need to cut this cord and go wireless, envisioned by Tesla years ago, is finally set to become a reality.
From mobile computing and communications and vehicle charging to medical equipment and implants, wireless technology is taking centre stage. The pressing demands for convenience, flexibility and safety have created a strong desire to make this technology real. But the implementation problems are many, and my talk focuses on the heart of the wireless system: the amplifier, using novel gallium nitride based devices.
In my talk, I will walk you through the wireless power landscape, introducing the GaN RF space, state-of-the-art amplifiers, the benefits of GaN wireless technology and potential applications; review the applicability of these devices in the real world; and finally discuss whether they will compete or coexist with age-old technologies to unleash the era of wireless power transfer.
David Waite is a Managing Director at inCode Consulting, a division of Ericsson, where he leads the Strategy and Economics practice. He has over fifteen years of management consulting and corporate strategy experience with companies and clients across the information, communications, and technology segments. Over the past several years, David has delivered multiple 5G strategy and economics projects for Tier 1, 2, and 3 operators in North America and abroad.
Before joining inCode, David was a Director at Altman Vilandrie & Co., where he delivered growth strategy, due-diligence, and go-to-market strategies for clients throughout the communications ecosystem. Earlier at A.T. Kearney, David led consulting engagements as a Manager within the Financial Management practice. He also has held senior-level corporate strategy roles at two leading mobile infrastructure companies where he defined market entry and growth strategies to address emerging mobile internet opportunities. David holds an MBA from the MIT Sloan School of Management, an MS in Engineering from Dartmouth College, and a BS in Engineering from the University of New Hampshire.
5G is a new generation of high-capacity wireless technology intended to serve fixed broadband and high-throughput mobile broadband use cases. With peak speeds exceeding 1 Gbps and latencies on the order of 1 ms, 5G will complement rather than replace previous generations of wireless technology with fiber-speed performance. The panelist has evaluated 5G economics for both residential and enterprise use cases. These assessments have concluded that 5G can be a cost-effective broadband alternative for operators to satisfy growing bandwidth demand among residential and business users.
Recent 5G engagements include:
- 5G residential fixed wireless access (FWA) investment case for Tier-1 MNO
- Enterprise 5G strategy and economics for Tier-1 MNO
- Hybrid 28 GHz/3.5 GHz investment case for an international operator
- 3.5 GHz rural fixed access deployment for a Tier 1 MNO
- IoT, eMBB, and FWA strategy and investment case for an overseas operator
Prof. Monika Pinchas is with Ariel University, where she is the head of the graduate program at the Faculty of Electrical and Electronic Engineering. Her research interests are in the areas of blind equalization, frequency synchronization in OFDM systems, and network synchronization, in which she has published several papers in leading journals as well as two books. She previously served as the CTO at Resolute Networks, managed the hardware group for Radiotel's wireless transmission line of products, leading the development of modem technology, worked at Tadiran Communication, where she was recognized as an expert, and worked for Scitex on the design and implementation of hardware systems.
In modern computer networks, time synchronization is critical because every aspect of managing, securing, planning, and debugging a network involves determining when events happen. Time also provides the only common frame of reference between all devices on the network. Without synchronized time, accurately correlating log files between these devices is difficult, even impossible. With time synchronization we can also distribute frequency via the network, which is needed in the telecom industry (it is a requirement on the telco side to allow the use of, e.g., carrier Ethernet instead of SDH). GPS, NTP (Network Time Protocol) and IEEE 1588v2 (PTP) are the options for time distribution over the network. PTP is designed for local systems requiring accuracies beyond those attainable using NTP. It is also designed for applications that cannot bear the cost of a GPS receiver at each node, or for which GPS signals are inaccessible. The timing accuracy of the PTP algorithm depends strongly on the packet delay variation (PDV) in the network. Different paths in the network may lead to lower PDVs and thus to improved timing accuracy. In this work we show the timing performance of a PTP slave where other PTP slaves (all synchronized to the same master) assist with its timing synchronization task, and where the PDV is modeled as fractional Gaussian noise (fGn) with a Hurst exponent in the region 0.5 < H < 1, corresponding to the case of long-range dependency.
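The quantity that PDV perturbs is the slave's offset estimate from the standard PTP two-step exchange, which can be sketched directly from the protocol's four timestamps:

```python
def ptp_offset_delay(t1, t2, t3, t4):
    """Basic IEEE 1588 delay request-response exchange:
    t1: master sends Sync, t2: slave receives it,
    t3: slave sends Delay_Req, t4: master receives it.
    Assumes a symmetric path; packet delay variation (PDV)
    appears as noise on t2 and t4, directly degrading the
    offset estimate."""
    offset = ((t2 - t1) - (t4 - t3)) / 2.0
    delay = ((t2 - t1) + (t4 - t3)) / 2.0
    return offset, delay

# Slave clock 5 us ahead of the master, symmetric 100 us path delay:
offset, delay = ptp_offset_delay(0.0, 105e-6, 200e-6, 295e-6)
```

In the noise-free case above the estimator recovers the offset and delay exactly; with PDV (here modeled as fGn), averaging over multiple slaves synchronized to the same master is what the abstract exploits.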
FIU School of Computing and Information Sciences
S. S. Iyengar is a Distinguished Ryder Professor and Director of the School of Computing and Information Sciences at the Florida International University and is the founding Director of the FIU-Discovery Lab. Iyengar is a pioneer in the field of distributed sensor networks/sensor fusion, computational aspects of robotics and high performance computing. Iyengar has published over 500 research papers and has authored/co-authored/edited 20 books published by MIT Press, John Wiley & Sons, Prentice Hall, CRC Press, Springer Verlag, etc. These publications have been used in major universities all over the world.
His research publications are on the design and analysis of efficient algorithms, parallel computing, sensor networks, and robotics. He is a member of the European Academy of Sciences, a Fellow of IEEE, ACM, AAAS and NAI, a Fellow of the Society for Design and Process Science (SDPS) and of the Institution of Engineers (FIE), was awarded the Distinguished Alumnus Award of the Indian Institute of Science, Bangalore, and received the IEEE Computer Society Technical Achievement Award for contributions to sensor fusion algorithms and parallel algorithms. He is a Golden Core member of the IEEE-CS and has received a Lifetime Achievement Award conferred by the International Society of Agile Manufacturing (ISAM) in recognition of his illustrious career in teaching, research, and a lifelong contribution to the fields of engineering and computer science at the Indian Institute of Technology (BHU). Iyengar and Nulogix were awarded in the 2012 Innovation 2 Industry (i2i) Florida competition. Iyengar received a Distinguished Research Award from Xiamen University, China, for his research in sensor networks, computer vision and image processing. Among the landmark contributions of Iyengar and his research group are the development of grid coverage for surveillance and target location in distributed sensor networks and the Brooks–Iyengar fusion algorithm.
He has also been awarded an honorary Doctorate of Science and Engineering from an institution. He serves on the advisory boards of many corporations and universities around the world. He has served on many national science boards, including the NIH National Library of Medicine in bioinformatics, National Science Foundation review panels, NASA Space Science, the Department of Homeland Security, the Office of Naval Research, and many others. His contribution was a centerpiece of the pioneering effort to develop image analysis for science and technology and for the goals of the US Naval Research Laboratory. The impact of his research contributions can be seen in companies and national labs such as Raytheon, Telcordia, Motorola, the United States Navy, and DARPA agencies. His contributions include DARPA's program demonstration with BBN (Cambridge, Massachusetts), MURI, and researchers from PSU/ARL, Duke, the University of Wisconsin, UCLA, Cornell University and LSU.
The Brooks–Iyengar algorithm is a seminal work and a major milestone in distributed sensing, and can be used as a fault-tolerant solution for many redundancy scenarios. It is also easy to implement and embed in any networking system. In 1996, the algorithm was used in MINIX to provide greater accuracy and precision, which led to the development of the first version of RT-Linux. In 2000, the algorithm was also central to the DARPA SensIT program's distributed tracking effort, where acoustic, seismic and motion detection readings from multiple sensors were combined and fed into a distributed tracking system. It was likewise used to combine heterogeneous sensor feeds in applications fielded by BBN Technologies, BAE Systems, the Penn State Applied Research Lab (ARL), and USC/ISI.
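The interval-fusion idea at the core of the Brooks–Iyengar algorithm can be sketched as follows. This is a simplified toy version that returns only the fused point estimate and omits the full algorithm's output interval and edge-case handling:

```python
def brooks_iyengar(intervals, f):
    """Simplified Brooks-Iyengar fusion: each sensor reports an interval
    [lo, hi]; with at most f faulty sensors, keep the sub-regions covered
    by at least N - f intervals and return their midpoint average,
    weighted by how many intervals cover each region."""
    n = len(intervals)
    # All interval endpoints partition the axis into candidate regions.
    points = sorted({p for iv in intervals for p in iv})
    num = den = 0.0
    for lo, hi in zip(points, points[1:]):
        cover = sum(1 for a, b in intervals if a <= lo and hi <= b)
        if cover >= n - f:                 # survives up to f faults
            mid = (lo + hi) / 2.0
            num += cover * mid
            den += cover
    return num / den

# Four sensors, one faulty outlier; the fused estimate stays near 10:
est = brooks_iyengar([(9.0, 11.0), (9.5, 11.5), (8.5, 10.5), (20.0, 22.0)], f=1)
```

The faulty reading (20.0, 22.0) never reaches the required coverage of N - f = 3 intervals, so it is excluded from the fused value.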
The Thales Group, a UK defense manufacturer, used this work in its Global Operational Analysis Laboratory. It is applied in Raytheon programs where systems need to extract reliable data from unreliable sensor networks, sparing the growing investment otherwise needed to improve sensor reliability. The research behind the algorithm also resulted in tools used by the US Navy in its maritime domain awareness software.
In education, the Brooks–Iyengar algorithm has been widely taught at institutions such as the University of Wisconsin, Purdue, Georgia Tech, Clemson University, and the University of Maryland.
Beyond sensor networks, other fields such as time-triggered architectures, safety of cyber-physical systems, data fusion, robot convergence, high-performance computing, software/hardware reliability, and ensemble learning in artificial intelligence systems could also benefit from the Brooks–Iyengar algorithm.
IHP, Frankfurt (Oder), Germany
Najib Odhah received the B.Sc., M.Sc., and Ph.D. degrees from the Faculty of Electronic Engineering, Menoufia University, Menouf, Egypt, in 2002, 2009 and 2013, respectively. From 2013 to 2015, he was an assistant professor in the Electrical Communications Department, Faculty of Engineering, Ibb University, Yemen. He is currently a postdoctoral researcher at IHP microelectronics, Frankfurt (Oder), Germany. His research interests include wireless communications; signal processing and multiple-antenna techniques for wireless communications; Multiple-Input Multiple-Output (MIMO) techniques (traditional, cooperative, and massive); power allocation algorithms for Orthogonal Frequency Division Multiplexing (OFDM) systems (traditional and cognitive); Radio Resource Management (RRM) for wireless communications; digital signal processing; digital communications; interference cancellation and Inter-Cell Interference Coordination (ICIC) techniques; channel estimation and equalization; game-theory-based optimization techniques for future 5G wireless communication systems; Machine-to-Machine (M2M) and Device-to-Device (D2D) communications; and green wireless communications.
Mário Marques da Silva is an Associate Professor and the Director of the Department of Sciences and Technologies at Universidade Autónoma de Lisboa. He is also a Researcher at Instituto de Telecomunicações, in Lisbon, Portugal. He received his B.Sc. in Electrical Engineering in 1992, and the M.Sc. and Ph.D. degrees in Electrical and Computers Engineering (Telecommunications) in 1999 and 2005, respectively, both from Instituto Superior Técnico, University of Lisbon. Between 2005 and 2008 he was with NATO in Brussels (Belgium), where he managed the deployable communications of the new Air Command and Control System Program. He has been involved in multiple networking and telecommunications projects. He is the author of five books published by CRC Press and of several dozen journal and conference papers, a member of IEEE and AFCEA, and a reviewer for a number of international scientific IEEE journals and conferences. Finally, he has chaired many conference sessions and has served on the organizing committees of relevant EURASIP and IEEE conferences.
The evolution from 4G to 5G wireless systems is driven by the expected huge growth in user bit rates and overall system throughput. This requires a substantial increase in spectral efficiency, while maintaining or even improving power efficiency. To accomplish this, new transmission techniques are needed, the most promising being millimeter waves (mm-waves) and massive Multiple-Input Multiple-Output (m-MIMO). Moreover, the small wavelength means small antennas, allowing small-sized transmitters and receivers with a very high number of antenna elements and therefore enabling m-MIMO implementations. However, these frequencies present considerable challenges, both in propagation (high free-space path losses, small diffraction effects and almost total absorption losses due to obstacles) and in implementation (analog- and digital-domain design, efficient amplification, and signal processing requirements for equalization and user separation), which can be particularly demanding for m-MIMO systems. We consider the use of m-MIMO combined with single-carrier with frequency-domain equalisation (SC-FDE) modulations, which reduce the peak-to-average power ratio compared to other block transmission techniques (e.g., OFDM). A low-complexity iterative frequency-domain receiver based on the equal-gain-combining approach is proposed. This receiver requires no matrix inversions and has excellent performance, which can be very close to the matched filter bound after just a few iterations, even when the number of receive antennas is not very high.
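The equal-gain-combining idea can be illustrated with a toy, noise-free SC-FDE sketch: per subcarrier, each antenna's symbol is co-phased with the conjugate of its channel (no matrix inversion), summed across antennas, and normalized. This pure-Python example with single-tap channels and a tiny block size is not the proposed iterative receiver itself:

```python
import cmath

def dft(x, inverse=False):
    """Naive DFT, adequate for a toy block size."""
    n = len(x)
    s = 1 if inverse else -1
    out = [sum(x[t] * cmath.exp(s * 2j * cmath.pi * k * t / n) for t in range(n))
           for k in range(n)]
    return [v / n for v in out] if inverse else out

def egc_sc_fde(rx_blocks, channels):
    """Equal-gain-combining SC-FDE: per subcarrier k, co-phase each
    antenna with conj(H)/|H|, sum over antennas, and normalize by the
    summed channel magnitudes -- no matrix inversion anywhere."""
    n = len(rx_blocks[0])
    freq_rx = [dft(r) for r in rx_blocks]  # back to frequency domain
    eq = []
    for k in range(n):
        num = sum(freq_rx[a][k] * channels[a][k].conjugate() / abs(channels[a][k])
                  for a in range(len(rx_blocks)))
        den = sum(abs(channels[a][k]) for a in range(len(rx_blocks)))
        eq.append(num / den)
    return dft(eq, inverse=True)           # time-domain symbol estimates

# Noise-free demo: a BPSK block through two flat single-tap channels.
x = [1, -1, 1, 1]
H = [[1.2 + 0j] * 4, [0.5j] * 4]           # per-antenna subcarrier gains
rx = [dft([Hk * Xk for Hk, Xk in zip(h, dft(x))], inverse=True) for h in H]
est = egc_sc_fde(rx, H)
```

With flat channels and no noise the combiner recovers the transmitted block exactly; the real receiver iterates this combining with interference cancellation to approach the matched filter bound.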
Aarhus University, Denmark
Rune Hylsberg Jacobsen holds an MSc (1995) in physics and chemistry and a PhD (1997) in optoelectronics from Aarhus University, Denmark. He has been an associate professor in the Electronics and Computer Engineering section at Aarhus University since 2010. His professional career also includes 15 years in the telecommunication industry, where he managed research & development products and teams. His main research interests embrace wireless networking, network security, smart grid communication, and distributed computing for the Internet of Things. He is the author of more than 50 scientific papers and has been a speaker at several conferences.
Internet of Things (IoT) and cloud computing technology are advancing digitization and efficiency across application domains. Sensor technology provides insight into system performance and other observables. Sensor devices range from simple electronics to highly sophisticated multi-spectral cameras capable of measuring earth-surface reflectance in different spectral bands. Such remote sensors can be carried by drones or satellites to produce streams of geospatial data. A satellite such as Sentinel-2, with revisit times of 5-10 days, delivers open data exceeding gigabytes per week for a region such as Denmark. Open satellite data is provided in downloadable bulk and calls for efficient processing aimed at gaining new knowledge. In this talk, I present results from a feasibility study on the use of elastic cloud computing to efficiently provide data analytics based on bulk satellite data from multi-spectral camera sensors. The performance of the processing cloud infrastructure is analyzed with the aim of determining a cost-optimal elastic cloud computing infrastructure. The infrastructure is based on Open Geospatial Consortium (OGC) standards, and OpenSensorHub is used to provide a dynamic sensor web application combining geospatial maps and time-series data. Furthermore, I present an efficient caching mechanism that minimizes the need for data downloads. The cloud infrastructure allows us to prepare knowledge gained from data and present it to stakeholders in a cost-efficient manner. Results from a case study in precision agriculture demonstrate several conceivable applications of calculated vegetation indices, such as field classification and harvesting detection.
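A representative vegetation index of the kind computed in such pipelines is NDVI, a one-line calculation over two spectral bands (for Sentinel-2, band 8 is near-infrared and band 4 is red); the reflectance values below are illustrative:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from surface reflectance:
    (NIR - red) / (NIR + red), in [-1, 1]."""
    return (nir - red) / (nir + red)

# Dense vegetation reflects strongly in NIR and absorbs red light,
# so it scores much higher than bare soil:
vegetated = ndvi(0.45, 0.05)
bare_soil = ndvi(0.30, 0.25)
```

Thresholding or tracking this index per field over the 5-10 day revisit cycle is what enables applications such as field classification and harvesting detection.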
Chief Data Scientist, IBM Watson IoT, IBM Academy of Technology, Switzerland
Romeo Kienzler holds a Master's degree in Information Systems with specialisation in Applied Statistics and Bioinformatics from the Swiss Federal Institute of Technology. He works for IBM Watson IoT as Chief Data Scientist. His current research focus is on large-scale machine learning on resilient cloud infrastructures, in addition to the application of deep learning technologies to time-series data. Romeo Kienzler is a member of the IBM Academy of Technology, the IBM Technical Expert Council and the IBM BigData BlackBelts team.
In this hands-on tutorial session you will learn how to connect various IoT devices (exemplified using a Raspberry Pi) to an MQTT broker, ingest data in real time into a NoSQL database (for historic data processing), train a machine learning model (e.g. a neural net) on this historic data, and then use the model to react to live data. Finally, the result is translated into an action and sent back to the IoT device.
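The train-then-react steps of that pipeline can be sketched with a standard-library-only stand-in. The MQTT broker, NoSQL store and neural net from the tutorial are omitted here; the mean/stddev anomaly "model" is purely illustrative:

```python
import statistics

def train(history):
    """Stand-in for the tutorial's training step: learn a mean/stddev
    anomaly threshold from historic sensor readings (in the tutorial,
    these would be pulled from the NoSQL database)."""
    return statistics.fmean(history), statistics.stdev(history)

def react(model, live_value, k=3.0):
    """Score a live reading against the model and pick the action to
    send back to the device (in the tutorial, via MQTT)."""
    mu, sigma = model
    return "alert" if abs(live_value - mu) > k * sigma else "ok"

# Historic temperature readings, then two live values:
model = train([20.1, 19.8, 20.3, 20.0, 19.9, 20.2])
normal = react(model, 20.1)    # within 3 sigma
anomaly = react(model, 25.0)   # far outside
```

In the session, the same loop runs with a real broker (e.g. via a client library such as paho-mqtt) and a trained neural net in place of the threshold model.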
Senior Lecturer, WSN Consults Ltd, United Kingdom
Dr. Celestine Iwendi is a Senior Lecturer at Federal University Ikwo, Nigeria, a sensor researcher, and Director of WSN Consults Ltd, a technology consulting company specializing in sensors, integrated circuits, wireless sensor network security modules and system manufacturing, with headquarters in Aberdeen, Scotland. He obtained a BSc and MSc in Electronics and Computer Engineering from Nnamdi Azikiwe University, Nigeria, an MSc in Communication Hardware and Microsystems from Uppsala University, Sweden, and a PhD in Engineering from the University of Aberdeen, Scotland. He has carried out many independent and supervised designs that apply knowledge of wireless sensor networks, signal processing and communications engineering to analyze and solve problems, at Nnamdi Azikiwe University, Awka, Nigeria, Nigerian Telecommunications (Nitel), Uppsala University, Sweden, the Norwegian University of Science and Technology, and the University of Aberdeen, Scotland. He is a member of the IEEE (Institute of Electrical and Electronics Engineers), the IEEE Communication Society, Swedish Engineers, and the Nigerian Society of Engineers. He is also an Associate at the Centre for Sustainable International Development.
According to the Commonwealth Telecommunications Organisation, Sub-Saharan Africa has succeeded in the last decade in bringing voice services within reach of some three quarters of the population, but the vast majority of the region is falling further behind the rest of the world in the broadband connectivity that will empower the Internet of Things. There are two main reasons for this: supply is limited, and prices have been very high. Broadband is the delivery of Internet IP bandwidth and all of the content, services and applications that consume this bandwidth. Its essential underpinning is therefore a high-capacity transmission backbone network capable of delivering this bandwidth. Providing an entry-level 256 kbps broadband service to hundreds, thousands or millions of customers requires a backbone transmission network with sufficient capacity to do so. And each time an operator increases its broadband service from 256 kbps to 512 kbps, 2 Mbps, or even 100 Mbps, this in turn escalates the capacity requirements of the transmission backbone network. Broadband is not just a consequence of economic growth; it is also a cause. This is the problem currently facing most Internet service providers in Africa.
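The backbone-dimensioning arithmetic behind this escalation can be sketched as follows; the 50:1 contention (oversubscription) ratio is an illustrative assumption, not a quoted figure:

```python
def backbone_capacity_gbps(subscribers, rate_mbps, contention_ratio=50):
    """Rough backbone dimensioning: aggregate peak demand divided by a
    contention ratio (how many subscribers statistically share each
    unit of backbone capacity)."""
    return subscribers * rate_mbps / contention_ratio / 1000.0

# Upgrading 100,000 subscribers from 256 kbps to 2 Mbps multiplies the
# required backbone capacity by roughly 8x:
cap_256k = backbone_capacity_gbps(100_000, 0.256)
cap_2m = backbone_capacity_gbps(100_000, 2.0)
```

Even under generous contention assumptions, each step up in the access tier translates directly into a proportional step up in backbone capacity, which is the core of the affordability problem described above.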
Many people in Africa still have only a landline Internet option, which is a “painfully slow” dial-up service. Cell phone service is erratic because of the thick tree cover. African consumers need satellite Internet to do their online banking, emailing, bill paying, social media (for example Facebook), and general Internet surfing. Many would also like to watch television shows online and occasionally download files for research, irrespective of where they are in the country.
University Putra Malaysia, Malaysia
Zuriati Ahmad Zukarnain is an associate professor at the Faculty of Computer Science and Information Technology, University Putra Malaysia. She is the head of the high-performance computing section at the Institute for Mathematical Research (INSPEM), University Putra Malaysia. She received her PhD from the University of Bradford, UK. Her research interests include: efficient multiparty QKD protocols for classical networks and the cloud; load balancing in wireless ad hoc networks; quantum processor units for quantum computers; authentication time of IEEE 802.15.4 with a multiple-key protocol; intra-domain mobility handling schemes for wireless networks; efficiency and fairness for new AIMD algorithms; and a kernel model to improve computation speedup and workload performance. She has published more than 100 papers in reputed journals and has been actively involved as a member of the editorial boards of some international peer-reviewed and cited journals. She is currently undertaking nationally funded projects on QKD protocols for cloud environments as well as routing and load balancing in wireless ad hoc networks.
For decades now, researchers have been trying to figure out how we can use the enormous potential of quantum mechanics to build a whole new generation of computers. According to Microsoft's research lab, we could crack the quantum computing code within the next 10 years. Fortunately, both Google and Microsoft are heavily invested in the idea of quantum computers; they believe this quantum technology will be in vast demand in the near future. Quantum cryptography sets its goal towards the holy grail of absolute security in the cryptography domain. However, the lack of efficient simulation and performance-analysis tools may delay achieving this goal. On the other hand, real quantum experiments and quantum communication require expensive components and are inefficient. Hence, a powerful, easy-to-use performance analysis tool will benefit all researchers in the area of quantum communication.
We propose an efficient simulation tool called the Quantum Communication Simulator: a tailored simulation tool for quantum communication experiments that can benefit theorists, experimentalists, hardware developers and end users. The Quantum Communication Simulator aims to be both performance-oriented and cost-oriented. It is based on drag-and-drop interfaces, with complementary functions such as budget estimation, cost planning and online collaboration, aligned with quantum communication experimental models. Cost estimation is included to assist the budgeting and decision-making process. Modelling, performance analysis and testing of real quantum experiments are expensive due to the nature of the optical components; to overcome this problem, the Quantum Communication Simulator is needed to model and simulate real quantum communication experiments. The motivation for this particular simulator is the culmination of the lead researchers' fruitful work in digital security, high-speed networks and quantum computation.
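The kind of experiment such a simulator would model can be illustrated with a toy BB84 key-sifting round. This is purely illustrative (ideal channel, no eavesdropper, no error correction or privacy amplification):

```python
import random

def bb84_sift(n, seed=0):
    """Toy BB84 key-sifting round on an ideal channel: Alice sends random
    bits encoded in random bases; Bob measures in random bases; both keep
    only the positions where their bases happened to match."""
    rng = random.Random(seed)
    alice_bits = [rng.randrange(2) for _ in range(n)]
    alice_bases = [rng.randrange(2) for _ in range(n)]
    bob_bases = [rng.randrange(2) for _ in range(n)]
    # With matching bases Bob reads Alice's bit; otherwise his result
    # is random (the quantum measurement in the wrong basis).
    bob_bits = [b if ab == bb else rng.randrange(2)
                for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    key_a = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
    key_b = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
    return key_a, key_b

key_a, key_b = bb84_sift(64)
# On an ideal channel the sifted keys agree exactly; a full simulator
# would add component costs, losses and noise on top of this skeleton.
```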