
Browsing by Author "Ojo, A. K."

Now showing 1 - 20 of 36
  • A comparison of the predictive capabilities of artificial neural networks and regression models for knowledge discovery
    (2013) Ojo, A. K.; Adeyemo, A. B.
    In this paper, Artificial Neural Network (ANN) and regression analysis models were compared to determine which performs better. Prediction with the ANN model was done using one hidden layer and three processing elements; prediction was also done using regression analysis, with the parameters of the regression model estimated by the least squares method. To determine the better predictor, the mean square errors (MSE) attached to the ANN and regression models were compared. Seven real series were fitted and predicted with both models. It was found that the MSE attached to the ANN model was smaller than that of the regression model, making ANN the better model for prediction.
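The model-selection criterion described in the abstract above — compare the mean square error of each model's predictions and prefer the smaller — can be sketched in a few lines of Python. The fitted values below are invented for illustration; the paper's seven real series are not reproduced here.

```python
def mse(actual, predicted):
    """Mean square error between observed values and model predictions."""
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

# Hypothetical fitted values for one series from each model (illustrative only).
actual   = [3.0, 4.1, 5.2, 6.1, 7.3]
ann_pred = [3.1, 4.0, 5.1, 6.2, 7.2]   # e.g. one hidden layer, three processing elements
reg_pred = [2.7, 4.4, 4.8, 6.5, 6.9]   # e.g. least-squares regression fit

scores = {"ANN": mse(actual, ann_pred), "Regression": mse(actual, reg_pred)}
better = min(scores, key=scores.get)   # the model with the smaller MSE is preferred
print(better)
```

With these illustrative numbers the ANN predictions sit closer to the observations, so its MSE is smaller and it is selected, mirroring the paper's conclusion.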
  • A mobile students’ industrial work experience scheme logbook application
    (Science and Education Publishing, 2020) Olojakpoke, D. M.; Ojo, A. K.
    Monitoring of students undergoing the Students’ Industrial Work Experience Scheme (SIWES) program by school-based supervisors is a difficult task because the paper-based logbook system currently employed is not adequate to determine how well students are progressing through the program. It is difficult for school-based supervisors to know whether students actually filled their logbooks daily, recording what they have done, or filled them all at once at the end of a long period, in which case such entries are very likely to be fraudulent. For this reason, school-based supervisors try to visit students on the program to monitor them physically; however, due to distance and other logistical issues, they are only able to visit such students once, at most twice, or sometimes never. The application was developed following the incremental model. Node.js was used for the backend, MongoDB as the database, and React Native to create the front-end. This application helps school-based supervisors monitor students on the SIWES program more effectively and also makes grading and commenting on logbook entries much easier. It can therefore be deployed to tertiary institutions in Nigeria to assist them in running their respective SIWES programmes.
  • A model for conflicts’ prediction using deep neural network
    (2021-10) Olaide, O. B.; Ojo, A. K.
    Conflict is part of human social interaction, which may occur from a mere misunderstanding among groups of settlers. In recent times, advanced Machine Learning (ML) techniques have been applied to conflict prediction. Strategic frameworks for improving ML settings in conflict research are emerging and are being tested with new algorithm-based approaches. These developments have given rise to the need to develop a Deep Neural Network model that predicts conflicts. Hence, in this study, two Artificial Neural Network models were developed using a dataset extracted from https://www.data.world, uploaded by the Armed Conflict Location and Event Data Project (ACLED) in four separate CSV files (January 2015 to December 2018). The dataset for 2015 has 2697 instances and 28 features; 2016 has 2233 instances with the same features; 2017 has 2669 instances; and 2018 has 1651 instances. After the development of the models, the baseline Artificial Neural Network achieved an accuracy of 95% and a loss of 5% on the training data, and an accuracy of 90% and a loss of 10% on the test set. The Deep Neural Network model achieved 98% accuracy and 2% loss on the training set, with 89% accuracy and 11% loss on the test set. It was concluded that to further improve the prediction of conflict, there is a need to address the issues with the dataset when developing a better and more robust model.
  • Predicting phishing websites using support vector machine and multi-class classification based on association rule techniques
    (2018-06) Woods, N. C.; Agada, V. E.; Ojo, A. K.
    Phishing is a semantic attack which targets the user rather than the computer. It is a new Internet crime in comparison with other forms such as viruses and hacking. Considering the damage phishing websites have caused to various economies by collapsing organizations, stealing information and diverting finances, various researchers have embarked on different ways of detecting phishing websites, but there has been no agreement about the best algorithm to use for prediction. This study integrates the strengths of two algorithms, Support Vector Machines (SVM) and Multi-Class Classification Rules based on Association Rules (MCAR), to establish a stronger and better means of predicting phishing websites. A total of 11,056 websites from both PhishTank and the Yahoo directory were used to verify the effectiveness of this approach. Feature extraction and rule generation were done by the MCAR technique; classification and prediction were done by the SVM technique. The result showed that the technique achieved 98.30% classification accuracy with a computation time of 2205.33s and a minimum error rate. It showed an Area under the Curve (AUC) of 98%, which reflects the proportion of accuracy in classifying phishing websites, and the model explained 82.84% of the variance in the prediction of phishing websites based on the coefficient of determination. Using the two techniques together in detecting phishing websites produced a more accurate result as it combined the strengths of both techniques. This research work capitalized on this advantage by building a hybrid of the two techniques to help produce a more accurate result.
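As a rough illustration of the feature-extraction step that precedes classification in work like this, the sketch below derives a few binary URL indicators commonly used in phishing detection. The specific features, thresholds and the sample URL are generic assumptions for illustration, not the paper's actual MCAR-generated feature set.

```python
import re

def url_features(url):
    """A small, illustrative subset of binary phishing indicators:
    raw IP address in the URL, an '@' redirect, unusual length,
    and a hyphenated host name."""
    host = url.split("/")[2]  # assumes a scheme like http:// is present
    return {
        "has_ip":     1 if re.search(r"\d+\.\d+\.\d+\.\d+", url) else 0,
        "has_at":     1 if "@" in url else 0,
        "long_url":   1 if len(url) > 54 else 0,
        "has_hyphen": 1 if "-" in host else 0,
    }

# A hypothetical suspicious URL: every indicator fires.
feats = url_features("http://192.168.0.1-secure-login.example@evil.test/verify")
print(feats)
```

Vectors like this would then feed a classifier such as the SVM named in the abstract; rule generation over such binary features is where an associative technique like MCAR comes in.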
  • Ako, A.
    (2019-09) Ojo, A. K.
    This study presents an approach to extracting data from an Amazon dataset and performing some preprocessing on it, combining Bi-Directional Long Short-Term Memory and 1-Dimensional Convolutional Neural Network techniques to classify the opinions into targets. After parsing the dataset and identifying the desired information, we performed data gathering and preprocessing tasks. A feature selection technique was developed to extract structural features, which refer to the content of the review (parts-of-speech tagging), along with behavioural features, which refer to the metadata of the review. Both behavioural and structural features of reviews and their targets were extracted. Based on the extracted features, a vector consisting of those features was created for each entity. In the evaluation phase, these feature vectors were used as inputs to the classifier to identify whether entities were fake or non-fake. The proposed solution achieved over 90% prediction accuracy, compared with 77% in related work. This increase resulted from combining the bidirectional long short-term memory and convolutional neural network algorithms.
  • An algorithmic framework for hybrid adaptive protocol (HAP) to manage broadcast storm problems in mobile ad-hoc networks (MANETS)
    (2008-09) Onifade, O. F. W.; Ojo, A. K.; Okiyi, K. U.
    The consequences of pure flooding, which is among the simplest and most straightforward approaches to performing broadcast, include redundant broadcast, contention and collision, collectively referred to as the broadcast storm problem (BSP). This results from the use of plain broadcasting approaches, leading to signal overlap in a geographical area with wireless communication. The counter-based scheme was developed to reduce the broadcast storm problem. However, to maintain a high delivery ratio in either sparse or dense networks, different thresholds are required. Because of the nature of MANETs, determining this threshold requires a level of dynamism, without which its operation will be marred. This research work thus proposed an algorithmic framework to address the BSP, using knowledge of a node's neighbourhood density to dynamically determine the threshold so as to adapt to both dense and sparse networks while limiting the above-stated constraints.
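The density-adaptive idea above can be sketched as follows: a node counts duplicate copies of a broadcast it has already heard, and only rebroadcasts while that count is below a threshold chosen from its neighbour count. The cutoffs and threshold values here are illustrative assumptions, not the paper's actual adaptation rule.

```python
def dynamic_counter_threshold(neighbour_count, sparse_cutoff=4, dense_cutoff=10):
    """Hypothetical rule: pick a counter threshold from local neighbourhood
    density -- rebroadcast eagerly in sparse regions, suppress in dense ones."""
    if neighbour_count <= sparse_cutoff:
        return 5   # sparse: tolerate more duplicate receptions before suppressing
    if neighbour_count <= dense_cutoff:
        return 3
    return 2       # dense: suppress quickly to curb the broadcast storm

def should_rebroadcast(duplicates_heard, neighbour_count):
    """Counter-based decision: forward only while the duplicate count
    is still below the density-dependent threshold."""
    return duplicates_heard < dynamic_counter_threshold(neighbour_count)

print(should_rebroadcast(3, 2))    # sparse neighbourhood: still rebroadcast
print(should_rebroadcast(3, 15))   # dense neighbourhood: suppress
```

The point of making the threshold a function of neighbour count, rather than a constant, is exactly the adaptivity the framework argues for: one fixed threshold cannot serve both sparse and dense regions.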
  • An electronic shopping system with a recommendation agent
    (2009) Ojo, A. K.; Emuoyibofarhe, O. J.; Emuoyibofarhe, O. N.; Lala, O. G.; Chukwuemeka, C. U.
    There is an inevitable need to improve the operational portfolio of the boutique and eliminate problems like time consumption, inconsistency and a host of other problems encountered by most business enterprises. This research study focused on the design of a web-based shopping system. Every shopping software system is precipitated by some business need: the need to correct a defect in an existing application, the need to adapt a legacy system to a changing business environment, the need to extend the functions and features of an existing application, or the need to create a new product, service or system. A feasibility study was carried out by interviewing an entrepreneur (business proprietor) to acquire knowledge about the mode of operation of the boutique; specialists in the field of fashion design were also interviewed to acquire knowledge to be used by the proposed software agent to give recommendations online. The existing system was studied, and deficiencies such as long queues, customer dissatisfaction and staff impatience were identified, as well as the need for customers to get professional guidance. The database was developed in MySQL using SQLyog, an SQL client compatible with MySQL Server as bundled with WAMP, XAMPP or Zend Server. The system accepts input from the user, whether an administrator or a customer, processes it (i.e. carries out the required action on the input as specified by the system design) and produces an output (either a completed transaction report and receipt or an outfit recommendation). Interfaces were designed using PHP on the Dreamweaver platform. MySQL Query Language on the SQLyog platform was used as a database tool to develop, organize and store all vital details about customers, suppliers, sales, products, and product categories.
The proposed system is designed to improve speed, accuracy, storage capability, customer satisfaction, job flexibility for the staff as well as shopping flexibility for the customer, and consistency in the boutique; it can be used by trained personnel as well as by the general public due to its simplicity. This work elaborates on the implementation and use of software agents in global transactions, i.e. people can transact from various locations and have their goods delivered to their doorstep, saving them time and the stress involved in physically doing the shopping.
  • Angular displacement scheme (ADS): providing reliable geocast transmission for mobile ad-hoc networks (MANETs)
    (2008-08) Onifade, O. F. W.; Ojo, A. K.; Akande, O. O.
    In wireless ad hoc environments, two approaches can be used for multicasting: multicast flooding or a multicast tree-based approach. Existing multicast protocols, mainly based on the latter approach, may not work properly in mobile ad hoc networks, as dynamic movement of group members can cause frequent tree reconfiguration with excessive channel overhead, resulting in loss of datagrams. Since the task of keeping the tree structure up to date in the multicast tree-based approach is nontrivial, multicast flooding is sometimes considered as an alternative approach for multicasting in MANETs. The scheme presented in this research attempts to reduce the forwarding space for multicast packets beyond earlier schemes, and also examines the effect of our improvements upon control packet overhead, data packet delivery ratio, and end-to-end delay, by further reducing the number of nodes that rebroadcast multicast packets while still maintaining a high degree of accuracy of delivered packets. The simulation was carried out with OMNeT++ to present a comparative analysis of the performance of the angular scheme against flooding and the LAR box scheme. Our results showed a clear improvement compared to the flooding and LAR box schemes.
  • Characterisation of academic journal publications using text mining techniques
    (Science and Education Publishing, 2017) Ojo, A. K.; Adeyemo, A. B.
    The ever-growing volume of published academic journals and the implicit knowledge that can be derived from them has not fully enhanced knowledge development but rather resulted in information and cognitive overload. Moreover, publication data are textual, unstructured and anomalous. Analysing such high-dimensional data manually is time-consuming, and this has limited the ability to make projections and identify trends derivable from the patterns hidden in various publications. This study was designed to develop and use intelligent text mining techniques to characterise academic journal publications. Journal scoring criteria by nineteen rankers from 2001 to 2013 in the 50th edition of the Journal Quality List (JQL) were used as criteria for selecting the highly rated journals. The text-miner software developed was used to crawl and download the abstracts of papers and their bibliometric information from the articles selected from these journals. The datasets were transformed into structured data and cleaned using filtering and stemming algorithms. Thereafter, the data were grouped into series of word features based on a bag-of-words document representation. The highly rated journals were clustered using the Self-Organising Map (SOM) method, with attribute weights in each cluster.
  • Ensuring QoS with adaptive frame rate and feedback control mechanism in video streaming
    (2012-12) Onifade, O. F. W.; Ojo, A. K.
    Video over best-effort packet networks is encumbered by a number of factors, including unknown and time-varying bandwidth, delay and losses, as well as additional issues such as how to fairly share network resources amongst many flows and how to efficiently perform one-to-many communication for popular content. This research investigates video streaming formats, encoding and compression techniques towards the development and simulation of a rate adaptation model to reduce packet loss. The thrust of this research is to enrich and enhance the quality of video streaming over wireless networks. We developed mathematical models which were thereafter simulated to demonstrate the need for advancing existing packet-scheduling solutions towards recovery from packet loss and error handling in video streaming.
  • Forecasting Nigerian equity stock returns using long short-term memory technique
    (2024) Ojo, A. K.; Okafor, I. J.
    Investors and stock market analysts face major challenges in predicting stock returns and making wise investment decisions. The predictability of equity stock returns can boost investor confidence, but it remains a difficult task. To address this issue, a study was conducted using a Long Short-Term Memory (LSTM) model to predict future stock market movements. The study used a historical dataset from the Nigerian Stock Exchange (NSE), which was cleaned and normalized to design the LSTM model. The model was evaluated using performance metrics and compared with other deep learning models such as Artificial Neural Networks (ANN) and Convolutional Neural Networks (CNN). The experimental results showed that the LSTM model can predict future stock market prices and returns with over 90% accuracy when trained with a reliable dataset. The study concludes that LSTM models can be useful in predicting financial time-series problems if well trained. Future studies should explore combining LSTM models with other deep learning techniques such as CNN to create hybrid models that mitigate the risks of relying on a single model for future equity stock predictions.
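Before an LSTM can be trained on a price series, the cleaned series is typically reshaped into supervised (window, next value) pairs. A minimal sketch of that standard preprocessing step follows; the lookback length and prices are illustrative, not drawn from the NSE dataset.

```python
def make_windows(series, lookback):
    """Turn a price series into (window, next_value) supervised pairs --
    the usual reshaping step before feeding a recurrent model like an LSTM."""
    X, y = [], []
    for i in range(len(series) - lookback):
        X.append(series[i:i + lookback])   # the last `lookback` observations
        y.append(series[i + lookback])     # the value the model must predict
    return X, y

prices = [10.0, 10.5, 10.2, 10.8, 11.0, 10.9]   # illustrative closing prices
X, y = make_windows(prices, lookback=3)
print(X[0], "->", y[0])
```

Each pair asks the model to predict the next price from the preceding three; the same windows, stacked into tensors, would be the input to an LSTM layer in any deep learning framework.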
  • Improved model for detecting fake profiles in online social network: a case study of Twitter
    (2019) Ojo, A. K.
    An Online Social Network (OSN) is like a virtual community where people build social networks and relations with one another. Open access to the Internet has increased the growth of OSNs, which has attracted intruders who exploit the weaknesses of the Internet and OSNs to their own gain. The rise in the usage of OSNs has posed security threats to users, as they share personal and sensitive information online which could be exploited by intruders creating fake profiles to carry out a series of malicious activities on the social network. It is no gainsaying that creating fake accounts has adverse effects, and the Internet has made it quite easy to conceal one’s identity; this makes it difficult to detect fake accounts as they try to imitate real accounts. In this study, a model that can accurately identify fake profiles in OSNs was proposed, using a Natural Language Processing technique to reduce the size of the dataset and thereby improve the overall performance of the model. Principal Component Analysis was used for appropriate feature selection. After extraction, six attributes/features that influenced the classifier were found. Support Vector Machine (SVM), Naïve Bayes and an Improved Support Vector Machine (ISVM) were used as classifiers. ISVM introduced a penalty parameter to the standard SVM objective function to reduce the inequality constraints between the slack variables. This gave a better result of 90%, compared with 77.4% for SVM and 77.3% for Naïve Bayes.
  • Improved model for facial expression classification for fear and sadness using local binary pattern histogram
    (2020) Ojo, A. K.; Idowu, T. O.
    In this study, a Local Binary Pattern Histogram model was proposed for facial expression classification for fear and sadness. A number of supervised machine learning models have been developed and used for facial recognition in past research. These classifiers require human effort to perform feature extraction, which has led to missed changes in human facial expressions, incomplete feature extraction and low accuracy. This study proposed a model for improving classification accuracy for fear and sadness and for extracting features that distinguish between them. Images of different people of varying ages were extracted from two datasets: the Japanese Female Facial Expression (JAFFE) dataset and the Cohn-Kanade dataset obtained from Kaggle. In order to achieve incremental development, classification was done using a Linear Support Vector Machine (LSVM) and a Random Forest Classifier (RFC). The accuracy rates for the LSVM models, LSVM1 and LSVM2, were 88% and 87% respectively, while the RFC models, RFC1 and RFC2, achieved 81% and 82% respectively.
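The core of the Local Binary Pattern Histogram approach named above is the 8-bit LBP code computed at each pixel: every neighbour in a 3x3 patch contributes a bit set to 1 when it is at least as bright as the centre. A minimal sketch over one patch (the pixel values are illustrative); an LBPH model then histograms these codes over image regions to form the feature vector.

```python
def lbp_code(patch):
    """8-bit Local Binary Pattern code for the centre of a 3x3 patch:
    each neighbour contributes a 1-bit if it is >= the centre pixel."""
    c = patch[1][1]
    # neighbours taken clockwise from the top-left corner
    neigh = [patch[0][0], patch[0][1], patch[0][2], patch[1][2],
             patch[2][2], patch[2][1], patch[2][0], patch[1][0]]
    return sum(1 << i for i, p in enumerate(neigh) if p >= c)

patch = [[6, 5, 2],
         [7, 6, 1],
         [9, 8, 7]]
print(lbp_code(patch))   # a single texture code in the range 0..255
```

Because the code depends only on brightness *ordering* around the centre, it is robust to uniform lighting changes, which is what makes LBP histograms effective texture descriptors for expression classification.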
  • Improved privacy protection model for prevention of data over-collection in smart devices
    (2022-12) Oketayo, A. M.; Ojo, A. K.
    In this study, an attempt was made to solve the problem of data over-collection using a machine learning algorithm with user data stored in a mobile cloud framework. This was achieved by designing a model, based on the security risk level of applications and the corresponding class level of users on the smartphone, that helps prevent smartphone apps from accessing and collecting users’ private data while still within the permission scope. Users can store information in the cloud environment, where huge numbers of users are involved. We developed a mobile agent simulator to generate data and determine the security risk level of apps on users’ data together with the class level of the data. A permission model was designed to determine whether an app is granted permission to access a user’s data. The data was trained using a neural network. For evaluation, the accuracy of the algorithm was compared with that of an existing algorithm. The data analysis showed that apps were restricted from accessing users’ data. The model, if deployed on a smartphone, will prevent apps from over-collecting users’ data even while still within the permission scope. This study showed that neural networks with mobile cloud computing can be applied to prevent data over-collection in smart devices.
  • Improvement on emotional variance analysis technique (EVA) for sentiment analysis in healthcare service delivery
    (Foundation of Computer Science FCS, New York, USA, 2024-05) Agada, V. E.; Ojo, A. K.
    This research introduces an innovative approach to improving sentiment analysis in healthcare service delivery by integrating Emotion and Affect Recognition (EAR) techniques into Emotional Variance Analysis (EVA). Leveraging logistic regression, the modifications, including adjusting confidence thresholds and utilizing the Rectified Linear Unit (ReLU) function, aim to address high polarity and enable real-time analysis. The methodology outlines a systematic process for EAR integration, offering practical insights for healthcare practitioners. In this study, additional datasets, including the Healthcare Patient Satisfaction Data Collection, the 9 Popular Patient Portal App Reviews for November 2023, and the HCAHPS Hospital Ratings Survey, are incorporated to enhance the robustness and reliability of the approach. The results across three healthcare centers demonstrate the effectiveness of this augmented approach, with comparisons against existing models using performance metrics. While showcasing promising potential, further research is needed to explore scalability and generalizability.
  • Improving information acquisition via text mining for efficient e-governance
    (2015-03) Adeyemo, A. B.; Ojo, A. K.
    In this paper we proposed a framework for integrating text mining with e-governance. We suggested that users of electronic governance can use text terms to describe their interests, which can then be processed for clustering and term extraction. The words expressed by users are tracked and processed, making it possible to generate content. We provided the framework and tested it on a few websites, using clustering and pre-processing for content management. The results are encouraging, and it is possible to extend such exercises to other text mining processes.
  • Improving node reachability QoS during broadcast storm in MANETs using neighbourhood density knowledge (NDK)
    (2008-12) Onifade, O. F. W.; Ojo, A. K.; Lala, O. G.
    The counter-based scheme was developed to reduce the broadcast storm problem (BSP). However, to maintain a high delivery ratio in either sparse or dense networks, different thresholds are required. Because of the nature of MANETs, determining this threshold requires a level of dynamism, without which its operation will be marred. Our earlier research work proposed an algorithmic framework to address the BSP, using knowledge of a node's neighbourhood density to dynamically determine the threshold so as to adapt to both dense and sparse networks while limiting the above-stated constraints. In this work, we present the simulation results of our attempt to improve the reachability of nodes in MANETs using Neighbourhood Density Knowledge (NDK). The major characteristics of MANETs remain indeterminate behaviour in the number of participating nodes, mobility, and sporadic topology changes based on nodal movement; any supporting protocol must therefore function under both sparse and dense populations of nodes. With the counter-based threshold value derived from neighbourhood information, an important metric considered is reachability, defined as the ratio of nodes that received the broadcast message out of all nodes in the network. Overall, the NDK approach performs best on both sparse and dense networks.
  • K-nearest neighbors Bayesian approach to false news detection from text on social media
    (Modern Education and Computer Science Press, 2022-08) Ogunsuyi, O. J.; Ojo, A. K.
    Social media usage has increased due to the rate at which technologies are emerging, and it is difficult to detect false news/information manually as it aims to capture the human mind. The spread of false news can cause havoc; therefore, detection of false news becomes paramount where almost everyone has access to social media. Our proposed system optimizes the false news detection process. The system combines the advantages of two textual feature extraction methods and two machine learning algorithms for text classification. Basic pre-processing methods were employed. Feature extraction was carried out using Term Frequency-Inverse Document Frequency with Word2Vector. The K-Nearest Neighbour (KNN) and Naïve Bayes (NB) algorithms are combined to give KNN Bayesian. Most available systems made use of a single feature extraction method, but in our system two feature extraction methods are combined. The evaluation metrics used were accuracy, precision, recall and F1-score, and KNN Bayesian performed better than KNN. To further evaluate our model, the Area under the Curve-Receiver Operator Characteristics (AUC-ROC) revealed that the AUC of the KNN Bayesian ROC curve is higher than that of KNN.
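A stripped-down illustration of the nearest-neighbour-over-text-features idea follows, using plain TF-IDF and a single nearest neighbour in place of the paper's TF-IDF + Word2Vector features and KNN Bayesian classifier. The documents, labels and query are invented for illustration.

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Plain TF-IDF (term frequency x inverse document frequency) vectors,
    a minimal stand-in for the combined feature extraction described above."""
    tokenised = [doc.lower().split() for doc in docs]
    df = Counter()                       # document frequency of each term
    for toks in tokenised:
        df.update(set(toks))
    n = len(docs)
    vecs = []
    for toks in tokenised:
        tf = Counter(toks)
        vecs.append({t: (tf[t] / len(toks)) * math.log(n / df[t]) for t in tf})
    return vecs

def cosine(a, b):
    """Cosine similarity between two sparse vectors stored as dicts."""
    dot = sum(a[t] * b.get(t, 0.0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

train = ["shocking miracle cure discovered",
         "government confirms new health policy",
         "miracle cure shocking doctors everywhere"]
labels = ["fake", "real", "fake"]
query = "shocking miracle claims"

vecs = tfidf_vectors(train + [query])    # embed query with the same vocabulary
qvec, vecs = vecs[-1], vecs[:-1]
nearest = max(range(len(train)), key=lambda i: cosine(qvec, vecs[i]))
print(labels[nearest])
```

The query shares its high-weight terms with the "fake" training documents, so its single nearest neighbour is labelled fake; a full KNN would vote over k neighbours, and the paper's hybrid additionally folds in Naïve Bayes.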
  • Knowledge discovery in academic electronic resources using text mining
    (2013-02) Ojo, A. K.; Adeyemo, A. B.
    Academic resource documents contain important knowledge, research results and high-quality information. However, they are lengthy and noisy, so it takes a lot of human effort to analyse them. Text mining can be used to analyse these textual documents and extract useful information from large numbers of documents quickly and automatically. In this paper, abstracts of electronic publications from the African Journal of Computing and ICTs, an IEEE Nigerian Computer Chapter publication, were analysed using text mining techniques. A text mining model was developed and used to analyse the abstracts collected. The texts were transformed into structured data in frequency form and cleaned up; the documents were split into series of word features (adjectives, verbs, adverbs, nouns) and the necessary words were extracted from the documents. The corpus collected had 1637 words. The word features were then analysed by classifying and clustering them. The text mining model developed is capable of mining texts from academic electronic resources, thereby identifying the weak and strong issues in those publications.
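The transformation from raw abstract text to the structured frequency form described above can be sketched in a few lines: lower-case, strip punctuation, drop stop words, and count the remaining word features. The stop-word list here is a small illustrative subset, not the one used in the paper.

```python
import re
from collections import Counter

STOPWORDS = {"the", "and", "of", "to", "in", "a", "is", "for"}

def word_frequencies(abstract):
    """Turn free text into a word-frequency table: lower-case, keep only
    alphabetic tokens, drop stop words, count what remains."""
    tokens = re.findall(r"[a-z]+", abstract.lower())
    return Counter(t for t in tokens if t not in STOPWORDS)

freq = word_frequencies("Text mining extracts useful information from large "
                        "amounts of text in academic documents.")
print(freq.most_common(2))
```

Tables like this, built per abstract, are exactly the structured input that downstream classification and clustering steps consume.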

DSpace software copyright © 2002-2025 Customised by Abba and King Systems LLC
