Software Engineering and Information Systems
Narges Akhound; Sahar Adabi; Ali Rezaee; Amir Masoud Rahmani
Articles in Press, Accepted Manuscript, Available Online from 27 September 2022
Abstract
The advent of Internet of Things (IoT) technology has made it possible for diverse devices to connect to the Internet and interact on a wide scale, producing large amounts of heterogeneous data. Cloud computing offers a convenient and efficient model for storing and processing these data, but the demand for real-time and delay-sensitive applications grows day by day, and network bandwidth limitations mean such workloads cannot be served by cloud computing alone. A fog layer located between the IoT devices and the cloud has therefore been proposed to overcome the resource constraints of mobile devices and to host delay-sensitive applications that require more storage and processing power. In this paper, an end-to-end architecture integrating the IoT, fog, and cloud layers into a large-scale distributed application is proposed to support high availability, make efficient use of fog-cloud resources, and achieve appropriate quality of service (QoS) in terms of delay and failure probability. The architecture consists of three hierarchical layers: IoT devices, fog nodes, and cloud data centers. Depending on the processing power of each layer's resources, a user request may be executed on the same layer or forwarded to a higher one. Quality attributes such as availability, performance, and interoperability of the proposed architecture are then evaluated with the scenario-based ATAM method, whose architectural analysis is grounded in studying the requirements and quality attributes of the system architecture.
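As an illustrative sketch only (not the paper's algorithm), the layered execution idea can be expressed as serving each request at the lowest layer whose capacity covers its demand; the layer names follow the abstract, while the capacity figures are invented:

```python
# Hypothetical capacities per layer, in arbitrary processing units.
LAYERS = [
    ("iot",   5.0),
    ("fog",   50.0),
    ("cloud", float("inf")),
]

def dispatch(demand):
    """Return the first (lowest) layer able to serve `demand`;
    heavier requests are forwarded upward, as in the abstract."""
    for name, capacity in LAYERS:
        if demand <= capacity:
            return name
    return "cloud"  # unreachable given infinite cloud capacity

print(dispatch(3.0))    # light request stays on the device
print(dispatch(30.0))   # medium request offloaded to fog
print(dispatch(500.0))  # heavy request goes to the cloud
```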
Software Engineering and Information Systems
Amir Abbas Farahmand; Reza Radfar; Alireza Poorebrahimi; Mani Sharifi
Volume 7, Issue 2, May 2021, Pages 103-126
Abstract
IoT, a state-of-the-art technology, faces many challenges in its growth and development. One of the main concerns is the potential threat posed by the spread of such technology across the world: its widespread adoption can threaten us far more seriously than the Internet available today. The challenges of adopting such a technology include both social and technical aspects. Technical limitations include security, privacy, and the resource, energy, and capacity issues raised by such large volumes of data and processing. Socially, a cultural infrastructure must first be provided for the diffusion of such technologies in the community. This study investigates the factors affecting readiness to adopt IoT technologies. The relationships are examined across the identified main categories, namely the social, cultural, human, technological, financial, and managerial aspects, together with government laws and regulations. The opinions of senior ICT executives nationwide were collected; the statistical population consists of experts and users in the financial sector, the stock exchange, and financial institutions. Since the population is effectively infinite, 384 individuals were selected by convenience sampling. SmartPLS was used to validate the model and test the relationships between variables. The results indicate that the identified categories affect IoT adoption readiness.
Software Engineering and Information Systems
Mohammad Reza Hassanzadeh; Farshid Keynia
Volume 7, Issue 1, February 2021, Pages 35-54
Abstract
Metaheuristic algorithms are typically population-based random search techniques. The general framework of a metaheuristic algorithm consists of several main parts: setting the algorithm parameters, population initialization, a global search phase, a local search phase, and checking the stopping conditions. In the parameter-setting part, the user can monitor the performance of the metaheuristic and tune it to the problem under consideration. This study provides an overview of the concepts, classifications, and methods of population initialization in metaheuristic algorithms discussed in the recent literature. Population initialization is a basic step common to all metaheuristics; accordingly, this study surveys the performance, methods, mechanisms, and categories of population initialization. The relationship between population initialization and other parameters that affect the performance and efficiency of metaheuristics, such as search-space size, population size, and the maximum number of iterations, is also collected from the literature and presented in a uniform format.
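As a minimal sketch of the common step the survey covers, uniform random initialization (the most widespread scheme across metaheuristics) can be written as:

```python
import random

def init_population(pop_size, dim, lower, upper, seed=None):
    """Uniform random initialization: each individual is sampled
    independently and uniformly within the per-dimension bounds."""
    rng = random.Random(seed)
    return [[rng.uniform(lower[d], upper[d]) for d in range(dim)]
            for _ in range(pop_size)]

pop = init_population(pop_size=20, dim=3, lower=[-5, -5, -5],
                      upper=[5, 5, 5], seed=1)
print(len(pop), len(pop[0]))
```

Other families the literature discusses (quasi-random sequences, opposition-based learning, chaotic maps) differ only in how each coordinate is drawn.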
Software Engineering and Information Systems
Asieh Ghanbarpour; Hassan Naderi; Soheil ZareMotlagh
Volume 6, Issue 3, August 2020, Pages 169-186
Abstract
Keyword search is known as a user-friendly alternative to structured query languages for retrieving information from graph-structured data. Efficiently retrieving relevant answers to a keyword query and effectively ranking those answers by relevance are the two main challenges of keyword search over graph-structured data. In this paper, a novel scoring function is proposed that uses both the textual and the structural features of answers to produce a more accurate ordering. In addition, a query-processing algorithm based on an information-spreading technique is developed to enumerate answers in approximate order. The algorithm is further improved by allowing skewed development toward more promising paths, enabling more efficient processing of keyword queries. Performance evaluation through extensive experiments on a standard benchmark of three real-world datasets shows the effectiveness and efficiency of the proposed algorithms.
Index Terms: Information retrieval, Database, Keyword search, Relevant answers, Information spreading.
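A hypothetical sketch (not the paper's actual function) of how a score can combine a textual component with a structural one: term coverage rewards relevance, while smaller answer trees rank higher.

```python
def answer_score(term_hits, tree_size, alpha=0.5):
    """Combined score for a candidate answer tree.
    term_hits: per-query-term match indicators (textual feature).
    tree_size: number of edges in the answer tree (structural feature).
    alpha: invented mixing weight between the two components."""
    textual = sum(term_hits) / max(len(term_hits), 1)  # fraction of terms covered
    structural = 1.0 / (1.0 + tree_size)               # compact trees score higher
    return alpha * textual + (1 - alpha) * structural

# A compact answer covering both query terms outranks a sprawling one:
print(answer_score([1, 1], tree_size=2) > answer_score([1, 1], tree_size=6))
```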
Software Engineering and Information Systems
Shahrzad Oveisi; Mohammad Nadjafi; Mohammad Ali Farsi; Ali Moeini; Mahmood Shabankhah
Volume 6, Issue 3, August 2020, Pages 187-200
Abstract
One of the key pillars of any operational system is proper software performance. Software failure can have dangerous consequences and can lead to adverse, undesirable events in the design or use phases. The goal of this study is to identify and evaluate the most significant software risks, based on the FMEA indices, in order to reduce the risk level with the help of experts' opinions. To this end, TOPSIS, one of the most widely applied methods for prioritizing and ordering the significance of events, is used. Since uncertainty in the data is inevitable, the entropy principle is applied together with fuzzy theory to weight the specified indices. The applicability and effectiveness of the proposed approach are validated through a real case study: risk analysis of an air/space software system. The results show that the proposed approach is valid and can provide valuable information for risk-management decision making on a software system in the early stages of its life cycle. After the events are obtained and their risk assessed with the proposed method, suggestions are given to reduce the risk of the events with higher risk ratings.
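A crisp (non-fuzzy) sketch of the two building blocks named above, with invented severity/occurrence/detection ratings: entropy weighting derives objective criterion weights, and TOPSIS ranks the failure events by closeness to the ideal. All indices are treated here as "larger is worse", so a higher closeness score means higher risk priority.

```python
import math

def entropy_weights(X):
    """Shannon-entropy weights for decision matrix X (rows = events,
    columns = FMEA indices); more divergent columns get more weight."""
    m, n = len(X), len(X[0])
    k = 1.0 / math.log(m)
    d = []
    for j in range(n):
        col = [X[i][j] for i in range(m)]
        s = sum(col)
        p = [c / s for c in col]
        e = -k * sum(pi * math.log(pi) for pi in p if pi > 0)
        d.append(1 - e)                       # degree of divergence
    total = sum(d)
    return [dj / total for dj in d]

def topsis(X, w):
    """Relative closeness of each event to the worst-case ideal."""
    m, n = len(X), len(X[0])
    norms = [math.sqrt(sum(X[i][j] ** 2 for i in range(m))) for j in range(n)]
    V = [[w[j] * X[i][j] / norms[j] for j in range(n)] for i in range(m)]
    ideal = [max(V[i][j] for i in range(m)) for j in range(n)]
    anti  = [min(V[i][j] for i in range(m)) for j in range(n)]
    scores = []
    for i in range(m):
        d_pos = math.sqrt(sum((V[i][j] - ideal[j]) ** 2 for j in range(n)))
        d_neg = math.sqrt(sum((V[i][j] - anti[j]) ** 2 for j in range(n)))
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Hypothetical S/O/D ratings for three failure events:
X = [[9, 3, 2], [4, 8, 6], [2, 2, 9]]
w = entropy_weights(X)
print(topsis(X, w))
```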
Software Engineering and Information Systems
Majid Tajamolian; Mohammad Ghasemzadeh
Volume 5, Issue 4, November 2019, Pages 245-254
Abstract
Various numbering schemes are used to track different versions and revisions of files, software packages, and documents. One major challenge in this regard is the lack of an all-purpose, adaptive, comprehensive, and efficient standard. To resolve this challenge, this article presents the Quadruple Adaptive Version Numbering Scheme (QAVNS). In the proposed scheme, the version identifier consists of four integers, called, from left to right, the "Release Sequence Number", "Generation Number", "Features List Number", and "Corrections List Number". The article assigns special values to the four numbers and describes their meanings. QAVNS is an "adaptive" scheme: it can track the versions and revisions of files, software packages, project output documents, design documents, rules, manuals, style sheets, drawings, graphics, administrative and legal documents, and other types of "informational objects" in different environments without changes to its structure. The proposed scheme can also monitor changes in informational objects such as virtual machine memory during the live migration process. The experimental and analytical results indicate the desirability and effectiveness of the proposed scheme in satisfying these expectations. The scheme can become a common standard and be applied successfully in academic, engineering, administrative, legislative, legal, manufacturing, industrial, operational, software development, documentary, and other environments; its standardization and widespread use would greatly improve the shared understanding of version and revision numbering.
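A sketch of the quadruple identifier as a comparable value. The four field names follow the abstract; the lexicographic comparison order shown here is an assumption, not a rule stated in the article.

```python
from functools import total_ordering

@total_ordering
class QAVNSVersion:
    """Quadruple identifier: release sequence, generation,
    features list, corrections list (left to right)."""
    def __init__(self, release, generation, features, corrections):
        self.parts = (release, generation, features, corrections)

    def __eq__(self, other):
        return self.parts == other.parts

    def __lt__(self, other):
        # Assumed ordering: compare fields left to right.
        return self.parts < other.parts

    def __str__(self):
        return ".".join(map(str, self.parts))

v1 = QAVNSVersion(1, 2, 0, 3)
v2 = QAVNSVersion(1, 2, 1, 0)
print(v1, "<", v2, "->", v1 < v2)
```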
Software Engineering and Information Systems
Saeid Khajehvand; Seyed Mahdi Abtahi
Volume 5, Issue 2, May 2019, Pages 81-92
Abstract
In this paper, chaotic dynamics and nonlinear control of the glucose-insulin system in type I diabetic patients and in a healthy person are investigated. The chaotic analysis of the blood glucose system relies on the Lyapunov exponent and the power spectral density, computed from time series derived from clinical data. Wolf's algorithm is used to calculate the Lyapunov exponent; positive values indicate that the dynamical system is chaotic. A wide frequency range in the power spectral density is also used to confirm the chaotic behavior. To control the chaotic system and reach the glucose level of a healthy person, a novel fuzzy high-order sliding mode control method is proposed, in which all control gains of the high-order sliding mode controller are computed accurately by a fuzzy inference system. The control algorithm is then applied to Bergman's mathematical model, which is verified against the clinical data set. In this system, the control input is the amount of insulin injected into the body and the control output is the blood glucose level at each moment. Simulation results for the closed-loop system under various conditions, along with the controller's performance in the presence of disturbance, indicate its proper functioning in terms of settling time, overshoot, and control input.
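For orientation, a minimal explicit-Euler simulation of Bergman's minimal model is sketched below. The three-state structure (glucose G, remote insulin action X, plasma insulin I) is standard; the parameter values and the constant infusion input are illustrative assumptions, not the paper's identified values, and the paper's fuzzy sliding-mode controller is not reproduced here.

```python
def bergman_step(G, X, I, u, dt, p1=0.028, p2=0.025, p3=1.3e-5,
                 n=0.09, Gb=81.0, Ib=7.0):
    """One Euler step of Bergman's minimal model.
    G: plasma glucose, X: remote insulin action, I: plasma insulin,
    u: insulin infusion (the control input). Parameters are illustrative."""
    dG = -p1 * (G - Gb) - X * G
    dX = -p2 * X + p3 * (I - Ib)
    dI = -n * (I - Ib) + u
    return G + dt * dG, X + dt * dX, I + dt * dI

G, X, I = 200.0, 0.0, 7.0          # hyperglycaemic initial state
for _ in range(600):               # 600 one-minute steps
    G, X, I = bergman_step(G, X, I, u=0.5, dt=1.0)
print(G)                           # glucose driven toward the basal level
```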
Software Engineering and Information Systems
Hamidah Ibrahim; Fatimah Sidi; Nur Izura Udzir; Poh Kuang Teo
Volume 4, Issue 4, November 2018, Pages 255-266
Abstract
Policy evaluation is the process of determining whether a request submitted by a user satisfies the access-control policies defined by an organization. Modality conflict is one of the main issues in policy evaluation, and existing detection approaches do not consider complex condition attributes such as spatial and temporal constraints. An effective authorization propagation rule is needed to detect the modality conflicts that occur among the applicable policies. This work proposes a modality conflict detection model that identifies the applicable policies during policy evaluation and supports an authorization propagation rule that investigates the class-subclass relationships of the subject, resource, action, and location of a request and a policy. A comparison with previous work shows that considering the condition attributes (i.e., spatial and temporal constraints) affects whether the applicable policies are retrieved, which in turn affects the accuracy of modality conflict detection, since the applicable policies retrieved for a request determine which conflicts can be found among them. In conclusion, the proposed solution identifies applicable policies and detects modality conflicts more effectively than previous work.
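The class-subclass idea can be illustrated with a toy sketch (hierarchies, attribute names, and policies are all invented for illustration): a policy applies when each of its attributes equals or is an ancestor of the request's attribute, and a modality conflict arises when applicable policies disagree on permit versus deny.

```python
# Hypothetical class hierarchies (child -> parent) for subjects and locations.
HIERARCHY = {"intern": "staff", "staff": "employee",
             "room101": "floor1", "floor1": "building"}

def subsumes(general, specific):
    """True if `general` equals `specific` or is one of its ancestors."""
    while specific is not None:
        if specific == general:
            return True
        specific = HIERARCHY.get(specific)
    return False

def applicable(policy, request):
    return all(subsumes(policy[k], request[k]) for k in ("subject", "location"))

policies = [
    {"subject": "employee", "location": "building", "effect": "permit"},
    {"subject": "intern",   "location": "floor1",   "effect": "deny"},
]
req = {"subject": "intern", "location": "room101"}
effects = {p["effect"] for p in policies if applicable(p, req)}
print("modality conflict" if {"permit", "deny"} <= effects else "no conflict")
```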
Software Engineering and Information Systems
Syed Atif Ali Shah
Volume 4, Issue 3, August 2018, Pages 135-142
Abstract
Through this research, I have established a general strategy to appraise an organization against a scale of five process maturity levels while maintaining the strong mechanics of CMMI. Re-engineering of industrial CMMI proposes a novel method for industrial competence ranking of organizations and companies targeting various CMMI levels. The approach uses SCAMPI assessment techniques to rank organizations that fall below a certain level of the CMMI model, and adds credibility factors, i.e., score, reliance, and confidence level, to the evaluation of an organization's capability and maturity. The benefit of the proposed model is that an organization can set its objectives to achieve a target CMMI level and can be differentiated from less mature organizations at the same level. The technique not only reclassifies the CMMI levels but also exposes various confidence factors.
Index Terms: CMMI, Industrial Process Optimization, Process Engineering, Capability and Maturity Ranking, Product Quality Assurance.
Software Engineering and Information Systems
Negin Bagheri Renani; Elham Yaghoubi
Volume 4, Issue 3, August 2018, Pages 143-154
Abstract
With the growing number of processing cores in complex computational systems, interconnects have become the bottleneck of the whole system. Given the progress in building complex photonic interconnects on chip, optical data transmission is the best candidate to replace electrical interconnects, since it combines high bandwidth with low insertion loss on chip. Optical routers play an important role in the Optical Network-on-Chip (ONoC): they are responsible for selecting the path between the optical signal's source and destination. In recent years, silicon optical routers based on Micro-Ring Resonators (MRRs) and Mach-Zehnder Interferometers (MZIs) have been proposed. MRR switches have low bandwidth, whereas MZI switches inherently have wide bandwidth and can route data at high speed with nanosecond switching times; on the other hand, MRR switches consume less power and area than MZIs. Optical routers can also be divided into general routers and specific routers, where in specific routers some I/O paths are omitted to avoid deadlock. In what follows, several kinds of optical routers based on MZIs and MRRs are reviewed along with a set of their parameters.
Software Engineering and Information Systems
Meysam Dolatkia; Sahar Adabi; Ali Rezaee
Volume 4, Issue 3, August 2018, Pages 193-208
Abstract
The idea of automatically generating data entry forms from relational data models is common and well known, and it has been discussed more and more with the popularity of agile methods in software development and the accompanying evolution of programming tools. One requirement of such automation, whether in commercial products or in research projects, is the concept of metadata as a mediator between the database and the data entry forms. The metadata usually includes schemas and constraints of the target database, which serve as a model for automatic generation of data entry forms. However, most metadata models proposed in the literature have a simple, undetailed structure; in other words, they cover only the initial requirements of data entry. The main objective of this study is to focus on the structure of metadata and to discuss methods of enriching it to cover more data-entry requirements. Parts of a metadata model are also presented to make these ideas concrete.
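A toy sketch of the mediator idea: column metadata drives form-field generation. The metadata field names below are invented for illustration and are not taken from the paper's model.

```python
# Hypothetical enriched metadata fragment for one table.
metadata = {
    "table": "customer",
    "columns": [
        {"name": "name",  "type": "varchar", "max_length": 50,
         "required": True,  "label": "Full name"},
        {"name": "email", "type": "varchar", "max_length": 120,
         "required": False, "label": "E-mail"},
    ],
}

def generate_form(meta):
    """Map column metadata to abstract form-field descriptors."""
    widget = {"varchar": "text", "int": "number", "date": "date"}
    return [{"widget": widget.get(c["type"], "text"),
             "label": c["label"],
             "required": c["required"],
             "maxlength": c.get("max_length")}
            for c in meta["columns"]]

for field in generate_form(metadata):
    print(field)
```

Enriching the metadata (validation patterns, lookup sources, layout hints) extends this mapping without touching the generator's logic, which is the kind of coverage the paper argues for.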
Software Engineering and Information Systems
Zahra Baatmaanghelich; Ali Rezaee; Sahar Adabi
Volume 4, Issue 1, February 2018, Pages 1-6
Abstract
Healthcare is one of the areas with the greatest need for accurate information available at the right moment: right information at the right time saves lives. It is a vital domain that requires high processing power for large amounts of data. Due to the critical and special characteristics of these systems, formal methods are used for their specification, description, and verification. The goal of this research is to turn a graphical business process diagram into a formally based model. In this work, BPMN is extended with time and probability information and then mapped into probabilistic real-time CSP. This mapping can serve as a base model for modeling different system characteristics. The mapping is then applied to a case study in the pervasive healthcare domain and verified in a model checking tool.
Index Terms: Formal methods, CSP, BPMN, Pervasive healthcare, Model checking, Verification, Service Composition.
Software Engineering and Information Systems
Amin Moradbeiky; Amid Khatibi Bardsiri
Volume 4, Issue 1, February 2018, Pages 7-12
Abstract
Software project management has always faced challenges that often greatly affect the future outcome of projects, so managers of software projects constantly seek solutions. Implementing unproven approaches or relying on mere personal experience does not necessarily suffice to solve these problems; the field therefore needs tools that help software project managers confront its challenges. Estimating the effort required for software development is one such important challenge. In this study, a neural-network-based architecture is proposed that uses the PSO algorithm to increase its accuracy in estimating software development effort. The proposed architecture is tested on several datasets, and similar experiments are run on the same datasets using various widely used estimation methods. The results show the accuracy of the proposed model and have applications for researchers in software engineering and data mining.
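A compact sketch of the general PSO-plus-network idea (not the paper's architecture): a particle swarm searches the weight space of a tiny 2-2-1 network so as to minimize the squared error on a handful of invented (size, complexity) → effort pairs.

```python
import math, random

rng = random.Random(0)

def net(w, x):
    """Tiny 2-2-1 tanh network; `w` packs all 9 weights and biases."""
    h0 = math.tanh(w[0] * x[0] + w[1] * x[1] + w[2])
    h1 = math.tanh(w[3] * x[0] + w[4] * x[1] + w[5])
    return w[6] * h0 + w[7] * h1 + w[8]

# Synthetic (size, complexity) -> effort pairs, purely for illustration.
data = [((0.1, 0.2), 0.3), ((0.4, 0.1), 0.5),
        ((0.8, 0.6), 1.2), ((0.9, 0.9), 1.6)]

def mse(w):
    return sum((net(w, x) - y) ** 2 for x, y in data) / len(data)

def pso(dim=9, swarm=20, iters=200, inertia=0.7, c1=1.5, c2=1.5):
    """Standard global-best PSO minimizing `mse` over network weights."""
    pos = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    pcost = [mse(p) for p in pos]
    g = pbest[pcost.index(min(pcost))][:]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                vel[i][d] = (inertia * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (g[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            c = mse(pos[i])
            if c < pcost[i]:
                pcost[i], pbest[i] = c, pos[i][:]
                if c < mse(g):
                    g = pos[i][:]
    return g

best = pso()
print(mse(best))
```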
Software Engineering and Information Systems
Ramin Saljoughinejad; Vahid Khatibi
Volume 4, Issue 1, February 2018, Pages 27-40
Abstract
The literature review shows that software development projects often neither meet their deadlines nor run within their allocated budgets. One common reason is an inaccurate cost estimation process, although several approaches have been proposed in this field. Recent research suggests that to increase the accuracy of this process, estimation models have to be revised. The Constructive Cost Model (COCOMO) has often been referred to as an efficient model for software cost estimation; its popularity is due to its flexibility, as it can be used in different environments and covers a variety of factors. In this paper, we aim to improve the accuracy of cost estimation by enhancing the COCOMO model. To this end, we analyze the cost drivers using meta-heuristic algorithms; the improvement of COCOMO comes from the effective selection of coefficients and the reconstruction of the model. Three meta-heuristic optimization algorithms are applied together to enhance the COCOMO model, and the results of the proposed method are compared with COCOMO itself and other existing models. This comparison explicitly reveals the superiority of the proposed method.
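For reference, the basic COCOMO effort equation that such coefficient tuning targets is effort = a · KLOC^b · EAF, where (a, b) depend on the project mode and EAF aggregates the cost-driver ratings; the defaults below are Boehm's published organic-mode values.

```python
def cocomo_effort(kloc, a=2.4, b=1.05, eaf=1.0):
    """Basic COCOMO effort in person-months.
    a, b: mode coefficients (2.4, 1.05 = organic mode);
    eaf: effort adjustment factor, product of cost-driver multipliers."""
    return a * (kloc ** b) * eaf

# Organic-mode project of 32 KLOC with a neutral effort adjustment factor:
print(cocomo_effort(32))
```

The meta-heuristic enhancement described in the abstract amounts to searching for (a, b) and the driver coefficients that minimize estimation error on historical project data, instead of using these fixed defaults.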
Software Engineering and Information Systems
Mina Sadat Mousavi Kasravi; Mohammad Ahmadinia; Abbas Rezaiee
Volume 3, Issue 2, May 2017, Pages 65-74
Abstract
E-readiness is one of the major prerequisites for effective implementation of e-government: correct implementation requires an accurate assessment of the state of e-readiness in the target community. Several assessment models exist, and the correct choice among them is one of the most important challenges in this area. Evaluating and selecting the appropriate options for e-government implementation is a complex process, due to the involvement of different groups of decision makers, the interrelationships between the technology and the target community, and the existing platforms. In recent decades, computational methods and powerful decision-support systems have made it possible to select options more accurately, analyze qualitative and quantitative characteristics effectively, and study the interactions between them. This article examines the performance of e-readiness assessment models and multi-criteria decision-making methods and introduces a way to select the best e-readiness model for effective implementation of e-government. To this end, we introduce a layered architecture based on multi-criteria decision-making methods and SWOT analysis. The proposed layered architecture reduces decision-making errors and increases the accuracy of choosing the appropriate e-readiness assessment model.
Software Engineering and Information Systems
Vida Doranipour
Volume 3, Issue 2, May 2017, Pages 107-112
Abstract
Nowadays, effort estimation has become one of the key concerns of software project managers. Accurately estimating the effort needed to produce and improve a software product is a vital factor in the success or failure of software projects. The limited accuracy and flexibility of existing estimation models have attracted researchers' attention to this area in recent years. One existing method is COCOMO (Constructive Cost Model), which is regarded as an appropriate method for software projects. Although COCOMO was devised years ago, it still retains its effort estimation ability, and many researchers have attempted to improve it; despite these efforts, COCOMO's results are not yet satisfying. In this research, a new hybrid method is presented to increase COCOMO's estimation accuracy. In the proposed method, better coefficients are obtained than with basic COCOMO by combining invasive weed optimization with the COCOMO estimation method; with the best coefficients, the proposed model's optimality is maximized. The method is evaluated on a real data set and compared against other models, and the estimation results confirm the improvement of the operational parameters.
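A minimal sketch of invasive weed optimization itself, the search engine the abstract combines with COCOMO. A sphere function stands in for the real objective (which would be the estimation error of COCOMO with candidate coefficients); fitter weeds produce more seeds, the seed-scattering radius shrinks over iterations, and competitive exclusion keeps only the best plants.

```python
import random

rng = random.Random(2)

def f(x):
    """Stand-in objective (sphere); in the paper's setting this would be
    the estimation error of COCOMO with candidate coefficients."""
    return sum(xi * xi for xi in x)

def iwo(dim=2, iters=100, p_init=5, p_max=15, smin=0, smax=5,
        sigma_init=1.0, sigma_final=0.01, n=2):
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(p_init)]
    for it in range(iters):
        # Scatter radius decreases nonlinearly over the run.
        sigma = ((iters - it) / iters) ** n * (sigma_init - sigma_final) \
                + sigma_final
        costs = [f(w) for w in pop]
        best, worst = min(costs), max(costs)
        offspring = []
        for w, c in zip(pop, costs):
            ratio = 1.0 if worst == best else (worst - c) / (worst - best)
            seeds = int(smin + ratio * (smax - smin))  # fitter weeds seed more
            for _ in range(seeds):
                offspring.append([xi + rng.gauss(0, sigma) for xi in w])
        pop = sorted(pop + offspring, key=f)[:p_max]   # competitive exclusion
    return pop[0]

best = iwo()
print(f(best))
```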
Software Engineering and Information Systems
Abdolghader Pourali
Volume 3, Issue 1, February 2017, Pages 1-10
Abstract
Software architecture is one of the most fundamental products of the software development process with respect to behavioral and non-behavioral features such as availability or modifiability. There are different ways to evaluate a software architecture, one of which is the creation of an executable model. An executable model of a software architecture is a formal description of the architecture which, if exercised before the architecture is implemented, reveals its final behavior and function, so that possible problems can be identified and resolved. In this study we evaluate availability in the object-oriented style. To make the style executable, UML diagrams, especially the sequence diagram, were used to express the architectural behavior. Since the UML diagram itself is not executable, the following steps were taken: first, metric annotations were used to tag the diagrams; then the studied style diagram was transformed into an executable one; finally, after the Petri net design, the executable model based on colored Petri nets was evaluated using CPN Tools. The availability of an ATM with N=5 users was evaluated, and the results showed a high availability rate (approximately 100%), meaning a high rate of usability of the system when needed.
Software Engineering and Information Systems
Mehdi Sadeghzadeh
Volume 3, Issue 1, February 2017, Pages 31-38
Abstract
In data grids, reservation is accepted as a way to provide scheduling and quality of service. Users need access to data stored across a geographical environment, which can be addressed reliably through replication, so that users are directed to the nearest replica of the requested information. The most important question is in which sites of the distributed system the produced replicas should be located: by selecting suitable places for replicas, performance and efficiency improve and access times drop. In this study, an efficient method based on the firefly algorithm is presented to select the best places for the replicas created in a data grid, and it is compared with the simulated annealing (cooling) algorithm. The results show that the firefly algorithm performs better than the others, meaning it is better and more accurate than the genetic algorithm and particle swarm optimization in the data replication task.
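A minimal sketch of the firefly algorithm itself, with a sphere function standing in for the real placement cost (which would score a candidate replica placement by access time and load): each firefly moves toward brighter (lower-cost) ones, with attractiveness decaying with distance and a damped random walk.

```python
import math, random

rng = random.Random(3)

def cost(x):
    """Stand-in cost (sphere); in the paper's setting this would score
    a candidate replica placement by access time and load."""
    return sum(xi * xi for xi in x)

def firefly(dim=2, n=15, iters=100, beta0=1.0, gamma=1.0, alpha=0.2):
    X = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if cost(X[j]) < cost(X[i]):          # j shines brighter
                    r2 = sum((a - b) ** 2 for a, b in zip(X[i], X[j]))
                    beta = beta0 * math.exp(-gamma * r2)
                    X[i] = [a + beta * (b - a)
                            + alpha * (rng.random() - 0.5)
                            for a, b in zip(X[i], X[j])]
        alpha *= 0.97                                # damp the random walk
    return min(X, key=cost)

best = firefly()
print(cost(best))
```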
Software Engineering and Information Systems
Fatemeh Eghrari Solout; Mehdi Hosseinzadeh
Volume 2, Issue 4, November 2016, Pages 1-8
Abstract
One of the most important issues in knowledge discovery is opinion mining: a tool through which the opinions of people commenting on a specific issue can be evaluated to reach interesting conclusions. It is a subset of data mining and can be improved using data mining algorithms. An important part of opinion mining is sentiment analysis in social networks, which today contain billions of user comments on different issues. Previous research in this area has applied various methods to analyzing Persian comments. In these studies, preprocessing, which arranges the data set into a standard form for analysis, is one of the most important steps, and the number of hashtags selected for analysis is limited. To detect positive and negative comments, knowledge-extraction or neural-network techniques have been used. The current research presents an analysis method that can analyze any hashtag for each group of users, with no limitation in this regard. The type of hashtag, the number of likes, the type of user, and the polarity (positive or negative) of sentences can all be analyzed by this method. Simulation results and a comparison on a divorce-related data set show that the proposed method achieves acceptable performance.
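A minimal lexicon-based sketch of per-hashtag sentiment aggregation may clarify the kind of analysis described. This is not the paper's method: the mini-lexicon, the comments, and the hashtag names below are all hypothetical, and a real system would use a full Persian sentiment lexicon or a trained classifier.

```python
# Hypothetical mini-lexicon; a real system would use a full sentiment
# lexicon for the target language or a trained classifier.
POSITIVE = {"good", "great", "happy", "excellent"}
NEGATIVE = {"bad", "sad", "terrible", "poor"}

def comment_polarity(text):
    """Label a comment positive/negative/neutral by counting lexicon hits."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

def hashtag_report(comments):
    """Aggregate polarity counts per hashtag over (hashtag, comment) pairs."""
    report = {}
    for tag, text in comments:
        counts = report.setdefault(tag, {"positive": 0, "negative": 0, "neutral": 0})
        counts[comment_polarity(text)] += 1
    return report

sample = [
    ("#service", "great support and happy users"),
    ("#service", "terrible delays bad experience"),
    ("#update", "release notes posted"),
]
print(hashtag_report(sample))
```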
Software Engineering and Information Systems
Faranak Mireskandari; Ramin Nasiri; Gholamreza Latif Shabgahi
Volume 2, Issue 4, November 2016, Pages 49-55
Abstract
The telecommunications industry plays an important role in providing ICT services to a wide range of customers. In addition to individual customers, corporate customers also use these services and play an important role in generating return on investment for telecom companies (telcos); this group of customers should therefore not be ignored for any reason. Telcos serve them through special services known as B2B. The Business Process Framework (eTOM) is proposed as a telecom framework that standardizes and matures B2B processes through a separate section called Engaged Parties. In this paper, we use the ITSM Reference Model to improve the B2B processes in that section of the business process framework. Considering the ever-increasing demands and needs of customers (in this paper mostly enterprises and companies), and drawing on the strength of the Customer Relationship processes of the ITSM Reference Model, we aim to complete the B2B processes of the eTOM framework with a focus on telcos.
Software Engineering and Information Systems
Vahid Khatibi Bardsiri; Mahboubeh Dorosti
Volume 2, Issue 2, May 2016, Pages 11-22
Abstract
One of the important aspects of software projects is estimating the cost and time required to develop them; this has become one of the key concerns of project managers. Accurate estimation of the effort needed to produce and develop software strongly affects the success or failure of software projects and is regarded as a vital factor. The failure of current models to achieve convincing accuracy, and their limited flexibility, have attracted researchers' attention in recent years. Despite improvements in effort estimation, no consensus has emerged on a single best estimation model. One highly regarded effort estimation method is COCOMO. Although COCOMO was introduced many years ago, it retains its effort estimation capability for software projects, and researchers have repeatedly attempted to improve it by refining its structure; even so, COCOMO results are not always satisfactory. The present study introduces a hybrid model for increasing the accuracy of COCOMO estimation. By combining the bee colony algorithm with the COCOMO estimation method, the proposed approach obtains more effective coefficients than basic COCOMO, and selecting the best coefficients maximizes its efficiency. The simulation results show the superiority of the proposed model in terms of MMRE and PRED(0.15).
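The underlying quantities can be made concrete with a short sketch. The effort equation and the organic-mode coefficients (a = 2.4, b = 1.05) are the standard basic-COCOMO values, and MMRE and PRED are their usual definitions; the three projects below are invented toy data, not from the paper.

```python
# Basic COCOMO, organic mode: effort (person-months) = a * KLOC^b
A_ORGANIC, B_ORGANIC = 2.4, 1.05

def cocomo_effort(kloc, a=A_ORGANIC, b=B_ORGANIC):
    """Estimated effort in person-months for a project of `kloc` KLOC."""
    return a * kloc ** b

def mmre(actual, estimated):
    """Mean Magnitude of Relative Error over paired project lists."""
    return sum(abs(y - e) / y for y, e in zip(actual, estimated)) / len(actual)

def pred(actual, estimated, level=0.15):
    """PRED(level): fraction of projects with relative error within `level`."""
    hits = sum(abs(y - e) / y <= level for y, e in zip(actual, estimated))
    return hits / len(actual)

# Toy data: (size in KLOC, actual effort in person-months) -- illustrative only.
projects = [(10, 26), (50, 150), (100, 302)]
estimates = [cocomo_effort(k) for k, _ in projects]
actuals = [e for _, e in projects]
print(round(mmre(actuals, estimates), 3), pred(actuals, estimates))
```

A tuning approach like the paper's would adjust `a` and `b` per data set to drive MMRE down and PRED(0.15) up.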
Software Engineering and Information Systems
Behrouz Sadeghi; Vahid Khatibi Bardsiri; Monireh Esfandiari; Farzad Hosseinzadeh
Volume 1, Issue 4, November 2015, Pages 15-24
Abstract
One of the most important and valuable goals of the software development life cycle is software cost estimation (SCE). In recent years, SCE has attracted researchers' attention due to the huge volume of software project requests, and many models using heuristic and meta-heuristic algorithms have been proposed to drive the machine-learning process for SCE. COCOMO81, proposed by Barry Boehm in 1981, is one of the most popular SCE models; although it is an old estimation model, it is still widely used for cost estimation in its newer forms. In this paper, the Imperialist Competitive Algorithm (ICA) is employed to tune the COCOMO81 parameters. Experimental results show that on the separated COCOMO81 data set, ICA can estimate the COCOMO81 model parameters such that the performance measures improve significantly. The proposed hybrid model is flexible enough to tune the parameters for any data set in COCOMO81 form.
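The parameter-tuning idea can be sketched with a simple search over the effort coefficients (a, b). This is only an illustration: a plain grid search stands in for the paper's ICA (both explore the coefficient space to minimize MMRE), and the three (KLOC, effort) pairs are invented toy data.

```python
def mmre(actual, estimated):
    """Mean Magnitude of Relative Error over paired project lists."""
    return sum(abs(y - e) / y for y, e in zip(actual, estimated)) / len(actual)

# Toy (KLOC, actual person-months) pairs -- illustrative only.
data = [(5, 14), (20, 62), (60, 210)]

def fitness(a, b):
    """MMRE of the effort model a * KLOC^b on the toy training projects."""
    est = [a * k ** b for k, _ in data]
    return mmre([y for _, y in data], est)

# Grid search stands in for the Imperialist Competitive Algorithm here:
# sweep a in [1.0, 5.0] (step 0.1) and b in [0.90, 1.30] (step 0.01).
best = min(
    ((a / 10, b / 100) for a in range(10, 51) for b in range(90, 131)),
    key=lambda ab: fitness(*ab),
)
print(best, round(fitness(*best), 4))
```

Because the grid contains the standard organic coefficients (2.4, 1.05), the tuned pair is guaranteed to fit this data at least as well as the untuned model.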
Computer Architecture and Digital Systems
Reza Bahri; Hossein Yarmohammadi; Mohammad Reza Keshavarzi; Gholamreza Moradi
Volume 1, Issue 2, May 2015, Pages 23-28
Abstract
This paper presents the design and simulation of a typical Ka-band satellite beacon receiver for propagation experiments. Using the satellite beacon signal as a reference is one of the most important methods in satellite wave propagation studies. Satellite beacons are commonly used for pointing large antennas, but such signals can also measure the effects of natural phenomena, such as atmospheric gases, water vapor, oxygen molecules, clouds, rain, dust, and fog, in different layers of the atmosphere, including the troposphere and the ionosphere. In recent years, various analog and digital satellite beacon receiver designs have been proposed and implemented. Beacon signals serve several applications, including precise orientation of the earth station toward the satellite, automatic frequency control, and satellite propagation research; these uses demonstrate the importance of a reference beacon signal. Receiving a satellite beacon requires an appropriate receiver: locking onto the beacon signal is difficult, and although conventional satellite receivers can generally track it, the nature of the signal calls for a dedicated receiver. In this paper, the design and simulation of a Ka-band satellite beacon signal receiver are carried out.