Volume 18, No 3

■ Cover Page (PDF 3151 KB)  ■ Table of Contents, March 2022 (PDF 34 KB)

  
  • Empirical Research for Self-Admitted Technical Debt Detection in Blockchain Software Projects
    Yubin Qu, W. Eric Wong, and Dongcheng Li
    2022, 18(3): 149-157.  doi:10.23940/ijpe.22.03.p1.149157
    Abstract    PDF (300KB)   
    Blockchain technology has been used in various fields, including digital currencies and distributed storage. However, the detection of self-admitted technical debt (SATD) in open-source blockchain software systems has not been studied. We provide an in-depth analysis of the code comments of open-source blockchain software projects. A cost-sensitive pre-trained model is proposed and compared with the baseline model on several evaluation metrics, and the results are statistically analyzed. The results show that the pre-trained model based on natural language understanding achieves better classification performance, improving on the baseline method by 102% in the F1 metric.
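
The abstract does not spell out the model details; purely as an illustration of the cost-sensitivity idea, the sketch below weights the rare SATD class more heavily in a simple comment classifier (scikit-learn standing in for the authors' pre-trained model; all comments, labels, and weights are invented).

```python
# Hypothetical sketch: cost-sensitive classification of SATD code comments.
# TF-IDF + logistic regression stands in for the paper's pre-trained NLU model.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

comments = [
    "TODO: refactor this consensus hack later",      # SATD
    "compute the block hash",                        # not SATD
    "FIXME: temporary workaround for the fork bug",  # SATD
    "validate transaction signatures",               # not SATD
]
labels = [1, 0, 1, 0]

vec = TfidfVectorizer()
X = vec.fit_transform(comments)

# Cost-sensitivity: misclassifying the rare SATD class costs 10x more.
clf = LogisticRegression(class_weight={0: 1.0, 1: 10.0}).fit(X, labels)
print(clf.predict(vec.transform(["hack: skip validation for now"])))
```
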
  • Performance Improvement Scheme of Blockchain Consensus for Supply Chain
    Zhihong Liang, Na Du, Yuxiang Huang, Kai Liu, and Zhichang Guo
    2022, 18(3): 158-166.  doi:10.23940/ijpe.22.03.p2.158166
    Abstract    PDF (363KB)   
    At present, supply chain applications of blockchain suffer from performance problems dominated by Byzantine nodes. To address the problems of consortium chains based on Practical Byzantine Fault Tolerance (PBFT), a scheme combining a Support Vector Machine with PBFT, namely S-PBFT, is proposed. First, an SVM is used to evaluate the credit of consensus nodes, and nodes with high creditworthiness are given priority to act as the primary node, which addresses the consensus efficiency problem caused by view switching. Second, smart contracts are used to keep the whole process effective and controllable, improving the authenticity and reliability of the operation results. Finally, testing on Hyperledger Fabric verified that S-PBFT improves the throughput and latency of the blockchain-based supply chain to a certain extent.
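
As a rough sketch of the node credit evaluation step only, the snippet below trains an SVM on invented per-node features (latency, past fault rate, uptime) and promotes the candidate with the highest decision score to primary; the feature set, data, and node names are assumptions, not the paper's.

```python
# Hypothetical sketch of S-PBFT-style node credit scoring with an SVM.
import numpy as np
from sklearn.svm import SVC

# Invented training data: [avg latency (ms), past fault rate, uptime ratio] per node.
X_train = np.array([
    [20, 0.01, 0.99],   # well-behaved node
    [35, 0.02, 0.97],
    [120, 0.30, 0.70],  # unreliable node
    [200, 0.45, 0.60],
])
y_train = np.array([1, 1, 0, 0])  # 1 = creditworthy, 0 = not

svm = SVC(kernel="rbf").fit(X_train, y_train)

# Candidate consensus nodes for the next view; highest credit score becomes primary.
candidates = {"node-A": [30, 0.02, 0.98], "node-B": [150, 0.35, 0.65], "node-C": [25, 0.01, 0.99]}
scores = {name: float(svm.decision_function([feat])[0]) for name, feat in candidates.items()}
primary = max(scores, key=scores.get)
print(scores, "->", primary)
```
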
  • Two-Level Assessment Method for Electrical Fire Risk of High-Rise Buildings based on Interval TOPSIS Method
    Lei Su, Fan Yang, Yu Shen, and Zhichun Yang
    2022, 18(3): 167-175.  doi:10.23940/ijpe.22.03.p3.167175
    Abstract    PDF (320KB)   
    The electrical fire risk of high-rise buildings remains high and has long threatened people's lives and property. Therefore, accurately evaluating the electrical fire risk and identifying the risk factors are vital for formulating proper electrical fire prevention measures. This paper proposes a two-level assessment method for the electrical fire risk of high-rise buildings based on the interval TOPSIS method. Firstly, electrical fire data and expert evaluation opinions are collected, and the improved Delphi method is used to obtain the two-level index framework for electrical fire risk assessment. Then, the interval analytic hierarchy process is used to determine the weight coefficients of the electrical fire risk indices of high-rise buildings. Finally, the risk index factors are ranked with the interval TOPSIS method. Verification shows that the leading electrical fire risk factors of high-rise buildings are the chain reaction of inflammables and explosives (accounting for 27.9%), fires caused by electrical lines (accounting for 22.07%), and fires caused by electrical equipment (accounting for 20.89%), with the overload risk factor accounting for the largest proportion among the sub-indicators.
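
The paper's exact formulation is not reproduced here; the sketch below shows one simplified interval TOPSIS ranking over invented interval scores and weights, just to illustrate how interval-valued criteria can be ordered by closeness to an ideal solution.

```python
# Hypothetical simplified interval TOPSIS ranking of risk factors.
# Each score is an interval [low, high]; all criteria treated as benefit-type.
import numpy as np

factors = ["electrical lines", "electrical equipment", "inflammables chain reaction"]
# Invented interval decision matrix: rows = factors, columns = criteria.
low  = np.array([[0.6, 0.5], [0.5, 0.6], [0.7, 0.7]])
high = np.array([[0.8, 0.7], [0.7, 0.8], [0.9, 0.9]])
w = np.array([0.6, 0.4])                              # criterion weights (e.g. from interval AHP)

norm = np.sqrt((low ** 2 + high ** 2).sum(axis=0))    # column-wise normalization
vl, vh = w * low / norm, w * high / norm              # weighted normalized intervals
pis, nis = vh.max(axis=0), vl.min(axis=0)             # positive / negative ideal solutions

d_pos = np.sqrt(((vl - pis) ** 2 + (vh - pis) ** 2).sum(axis=1))
d_neg = np.sqrt(((vl - nis) ** 2 + (vh - nis) ** 2).sum(axis=1))
closeness = d_neg / (d_pos + d_neg)                   # higher closeness = higher-ranked risk factor
for name, c in sorted(zip(factors, closeness), key=lambda t: -t[1]):
    print(f"{name}: {c:.3f}")
```
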
  • Analysis of Data Handling Challenges in Edge Computing
    Sukruta Pardeshi, Chetana Khairnar, and Khalid Alfatmi
    2022, 18(3): 176-187.  doi:10.23940/ijpe.22.03.p4.176187
    Abstract    PDF (658KB)   
    Traditional Cloud Computing networks are intensely centralized: data is collected at the edges and transmitted back to the central network servers for computation. With the dramatic increase in IoT devices located close to the network edge, this centralized approach lacks the computational power to handle data collection and storage over the network. Edge Computing (EC) extends the cloud computing capabilities of gathering, storing, processing, and analyzing massive amounts of data by locating services close to the edge of the network. Yet, the unique features of Edge Computing introduce several challenging issues in the data handling process. This paper provides an overview of the data handling challenges faced in an Edge Computing network. It covers the fundamentals of Edge Computing: the basic architecture, how it differs from Cloud Computing, and its applications; it also discusses the threats encountered in Edge Computing. Various challenges arise in EC while storing, managing, and analyzing data over the network through different local edge nodes. The paper summarizes solutions to these problems in EC through different machine learning and deep learning algorithms. It also provides future research directions in Edge Computing.
  • TUMKFCM-ELM: An Unsupervised Multiple Kernelized Fuzzy C-Means Extreme Learning Machine Approach for Heterogeneous Datasets
    Ankit R. Mune and Sohel A. Bhura
    2022, 18(3): 188-200.  doi:10.23940/ijpe.22.03.p5.188200
    Abstract    PDF (953KB)   
    Heterogeneity is one of the critical aspects of big data and creates data integration challenges for big data analysis. Heterogeneous data types also need to be unified through preprocessing. The heterogeneity of benchmark data can be summarized by indicating the data type along with its sampling rate and storage policy. Recently, the Kernelized Fuzzy C-Means clustering methodology has gained favor among researchers; kernel functions are employed as the similarity measure rather than the Euclidean distance used in traditional Fuzzy C-Means clustering. However, like conventional Fuzzy C-Means, this methodology suffers from inconsistent effectiveness because the initial cluster centers are still created from randomized, user-defined membership values of objects. This study presents a modified strategy that removes this random selection and improves the overall efficiency of the Kernelized Fuzzy C-Means clustering approach. This work implements a Three-Phase Unsupervised Multiple Kernelized Fuzzy C-Means Extreme Learning Machine (TUMKFCM-ELM) approach. The first phase is data preprocessing; the second phase applies unsupervised multiple-kernel Fuzzy C-Means clustering to determine the centroids and the membership matrix, which are updated until the stopping criterion is met and the final clusters are obtained; in the third phase, an ELM is applied to obtain the optimal coefficients. Multiple heterogeneous datasets have been collected from numerous sources for the simulation, along with exploratory data analysis and cluster distributions. The proposed approach is compared with the previous TUMK-ELM methodology using three validity metrics: accuracy, NMI, and purity. The results are visualized; the clustering performance is comparable in terms of effectiveness, and results are also reported in terms of time cost.
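
The full three-phase TUMKFCM-ELM pipeline is not reproduced here; the sketch below illustrates only the kernelized Fuzzy C-Means core with a single RBF kernel on synthetic data, where memberships and centers are updated iteratively as described above.

```python
# Hypothetical single-kernel (RBF) Fuzzy C-Means sketch; the paper's multiple-kernel
# and ELM phases are not reproduced.
import numpy as np

def rbf(a, b, sigma=1.0):
    return np.exp(-np.sum((a - b) ** 2, axis=-1) / (2 * sigma ** 2))

def kernel_fcm(X, c=2, m=2.0, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), c, replace=False)]
    for _ in range(iters):
        # K[i, k] = kernel similarity between point i and center k
        K = np.stack([rbf(X, centers[k]) for k in range(c)], axis=1)
        dist = np.clip(1.0 - K, 1e-12, None)        # kernel-induced distance term
        inv = dist ** (-1.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)    # fuzzy membership matrix
        w = (U ** m) * K                            # weights for the center update
        centers = (w.T @ X) / w.sum(axis=0)[:, None]
    return U, centers

X = np.vstack([np.random.default_rng(1).normal(0, 0.3, (20, 2)),
               np.random.default_rng(2).normal(3, 0.3, (20, 2))])
U, centers = kernel_fcm(X)
print(np.round(centers, 2))
```
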
  • Investigation for Performance Measures of Wireless Power Transfer (WPT) using MATLAB
    Vivek Mishra and Vibhuti
    2022, 18(3): 201-212.  doi:10.23940/ijpe.22.03.p6.201212
    Abstract    PDF (765KB)   
    Wireless power transfer (WPT) is used in consumer electronics, medical implants, and electric vehicles, and the WPT system has therefore received a lot of attention in recent years. WPT is a promising option in situations where physical connectors are unreliable and prone to failure. As the coupling distance increases, the efficiency of the WPT system decreases rapidly. To boost the performance of the WPT system, a variety of topologies have been used. MATLAB is used to perform transient analysis and understand the behaviour of the voltage waveforms at the transmitter and receiver ends. WPT efficiency with respect to distance is measured using S-parameters. The proposed model is tested with a peak-to-peak amplitude of 400 V at frequencies ranging from 10 to 90 kHz. The overall results show that the maximum efficiency is obtained at 52.75 kHz.
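
The paper's circuit model is not reproduced here; as an illustrative sketch, the snippet below sweeps 10-90 kHz over a generic series-series compensated two-coil link with invented component values and reports where the transfer efficiency peaks.

```python
# Hypothetical frequency sweep of a series-series compensated two-coil WPT link.
# Component values are invented; the paper's actual circuit is not reproduced.
import numpy as np

L1 = L2 = 100e-6             # coil inductances (H)
k = 0.2                      # coupling coefficient
M = k * np.sqrt(L1 * L2)     # mutual inductance
R1 = R2 = 0.5                # coil resistances (ohm)
RL = 10.0                    # load resistance (ohm)
f0 = 50e3                    # target resonant frequency (Hz)
C1 = C2 = 1.0 / ((2 * np.pi * f0) ** 2 * L1)    # resonant compensation capacitors

f = np.linspace(10e3, 90e3, 2000)
w = 2 * np.pi * f
Z1 = R1 + 1j * (w * L1 - 1.0 / (w * C1))        # primary loop impedance
Z2 = R2 + RL + 1j * (w * L2 - 1.0 / (w * C2))   # secondary loop impedance
Zr = (w * M) ** 2 / Z2                          # impedance reflected into the primary
I1 = 1.0 / (Z1 + Zr)                            # primary current for a 1 V source
I2 = 1j * w * M * I1 / Z2                       # induced secondary current
eff = (np.abs(I2) ** 2 * RL) / (np.abs(I1) ** 2 * R1 + np.abs(I2) ** 2 * (R2 + RL))

print(f"peak efficiency {eff.max():.2%} at {f[eff.argmax()] / 1e3:.2f} kHz")
```
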
  • Modeling of Pedestrians Travelling Behaviour (PTB) on Urban Environment for COVID-19 Pandemic using MLR Algorithm Analysis
    Badveeti Adinarayana and Badweeti Kasinayana
    2022, 18(3): 213-221.  doi:10.23940/ijpe.22.03.p7.213221
    Abstract    PDF (771KB)   
    In India, the state of Andhra Pradesh (A.P.) is developing rapidly among all Indian states, and its environment and economy have suffered as a result of the COVID-19 outbreak that began in 2019. In pandemic conditions in urban environments, pedestrians travel with and without masks and without maintaining social distance. The objective of this research is to discover the impact of COVID-19 on a pedestrian travelling behaviour model using descriptive statistics analysis (DSA). To achieve this goal, pedestrian safety metrics and pedestrian model variables are assessed. The DSA methodology used in this study was chosen based on the study locations discussed in this research. The data was gathered through field observations in different areas of A.P., such as vegetable and fruit markets, temples, and travel destinations in various urban districts. The pedestrian safety metrics and pedestrian variables are the total samples tested (TST), confirmed cases (CC), active cases (AC), cured/discharged cases (CDC), deceased cases (DC), negative cases (NC), and deaths (DN), each expressed as a count. Calibration and validation of the COVID-19 pedestrian sample of the model data are evaluated. The model results show that pedestrians in the 18-39 age group were the most affected by COVID-19, while the highest death rates were reported in the 50-64 and 65-74 age groups. Further study will continue this line of work.
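
Since the title refers to an MLR analysis, a minimal multiple-linear-regression sketch over the count variables listed above is shown below; the district-level figures are invented placeholders, not the study's data.

```python
# Hypothetical multiple linear regression (MLR) sketch over the abstract's
# count variables; all figures are invented placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression

# Columns: TST, CC, AC, CDC, NC (invented weekly counts for a few districts)
X = np.array([
    [12000,  950, 400,  520, 11050],
    [18000, 1400, 620,  740, 16600],
    [ 9000,  610, 260,  330,  8390],
    [22000, 1850, 800, 1000, 20150],
    [15000, 1200, 500,  660, 13800],
])
y = np.array([30, 46, 20, 62, 40])   # DN: deaths (invented)

model = LinearRegression().fit(X, y)
print("coefficients:", np.round(model.coef_, 4))
print("R^2 on the toy data:", round(model.score(X, y), 3))
```
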
  • Deep Learning Model for Black Spot Classification
    Geetanjali S. Mahamunkar, Arvind W. Kiwelekar, and Laxman D. Netak
    2022, 18(3): 222-230.  doi:10.23940/ijpe.22.03.p8.222230
    Abstract    PDF (436KB)   
    Black spots are accident spots where more than five accidents or more than ten fatalities have occurred in the past three years. The manual classification of accident spots into black spots by analyzing past data is a tedious task for traffic police. This paper presents a dataset of accidents during 2019, 2020, and 2021 in the Raigad District of Maharashtra, India. It also classifies each spot against the criteria for identifying an accident spot as a black spot using machine learning techniques such as Logistic Regression, Linear Discriminant Analysis, Gaussian Naïve Bayes, Support Vector Machine, and Multi-Layer Perceptron. We compare the performance of these algorithms in terms of various statistical parameters. Thus, the proposed model automates the manual task of classifying an accident spot as a black spot. Also, since the dataset includes the geocoordinates of the accident spots, researchers can use it for other geospatial data analysis tasks.
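
As a minimal sketch of the comparison described above, the snippet below cross-validates the five named classifiers on synthetic stand-in features; the actual Raigad accident dataset and its statistical parameters are not reproduced.

```python
# Hypothetical sketch comparing the classifiers named in the abstract on toy
# accident-spot features.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# Stand-in features (e.g., accident count, fatality count) and black-spot labels.
X, y = make_classification(n_samples=300, n_features=6, n_informative=4, random_state=0)

models = {
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "LDA": LinearDiscriminantAnalysis(),
    "Gaussian NB": GaussianNB(),
    "SVM": SVC(),
    "MLP": MLPClassifier(max_iter=2000, random_state=0),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name}: {acc:.3f}")
```
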
Online ISSN 2993-8341
Print ISSN 0973-1318