
Vol. 25 - Year 2000 Abstracts

No. 1

J. Brzeziński, M. Szychowiak:

Self-stabilization in distributed systems – a short survey

Abstract

Self-stabilization is an interesting and promising research field in computer science, owing to its guarantee of automatic recovery from any transient failure without additional effort. This paper presents an overview of self-stabilizing distributed algorithms. First, an outline of the self-stabilization paradigm is given, followed by a simple example and some formal definitions. Then, the characteristics of stabilization types are described. Finally, the paper presents several self-stabilizing algorithms and further lines of investigation for distributed systems.
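As context for the paradigm, the classic simple example is Dijkstra's K-state token ring: from any initial configuration the ring converges to exactly one circulating privilege. The sketch below is the standard textbook protocol, not code from the paper; the randomized central daemon is an illustrative assumption.

```python
import random

def stabilize(x, K, steps=500):
    """Run Dijkstra's K-state token ring from configuration x.

    Machine 0 holds a privilege when its state equals that of the
    last machine; machine i > 0 holds one when its state differs
    from machine i-1's. A central daemon fires one privileged
    machine per step (at least one is always privileged)."""
    n = len(x)
    for _ in range(steps):
        privileged = [i for i in range(n)
                      if (i == 0 and x[0] == x[n - 1])
                      or (i > 0 and x[i] != x[i - 1])]
        i = random.choice(privileged)
        if i == 0:
            x[0] = (x[0] + 1) % K   # machine 0 increments modulo K
        else:
            x[i] = x[i - 1]         # others copy their left neighbour
    return x

def count_privileges(x):
    n = len(x)
    return sum(1 for i in range(n)
               if (i == 0 and x[0] == x[n - 1]) or (i > 0 and x[i] != x[i - 1]))
```

With K at least the ring size, any configuration (here an arbitrary, "faulty" one) stabilizes to a single privilege within O(n²) moves, after which the single privilege merely circulates.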

K. Chakrabarty:

Bags with interval counts

Abstract

In the present paper, we consider the concept of bags as introduced by Yager and define the notion of bags with interval counts (IC-Bags). In situations where the counts of elements are not fixed but are represented as intervals of positive integers, IC-Bags can be of importance; consequently, some databases may require the IC-Bag representation. Some operations on IC-Bags are studied and some propositions are proved. We show that IC-Bags can be used when dealing with certain types of decision analysis problems.
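A minimal sketch of the data structure follows. The componentwise interval liftings of Yager's bag union (maximum count) and intersection (minimum count) are assumptions for illustration, not definitions taken from the paper.

```python
class ICBag:
    """Bag whose element counts are intervals (lo, hi) of
    non-negative integers -- a sketch of the IC-Bag idea."""

    def __init__(self, counts=None):
        # counts: dict mapping element -> (lo, hi)
        self.counts = dict(counts or {})

    def count(self, e):
        return self.counts.get(e, (0, 0))

    def union(self, other):
        # Yager's bag union takes the maximum count, lifted here
        # to intervals componentwise (an assumed lifting).
        keys = set(self.counts) | set(other.counts)
        return ICBag({e: (max(self.count(e)[0], other.count(e)[0]),
                          max(self.count(e)[1], other.count(e)[1]))
                      for e in keys})

    def intersection(self, other):
        # Bag intersection takes the minimum count, lifted likewise.
        keys = set(self.counts) & set(other.counts)
        return ICBag({e: (min(self.count(e)[0], other.count(e)[0]),
                          min(self.count(e)[1], other.count(e)[1]))
                      for e in keys})
```

For example, if one source records between 1 and 3 copies of an item and another records exactly 2, the union interval is (2, 3) and the intersection interval is (1, 2).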

N. Belacel, R. M. Boulassel:

The use of a fuzzy assignment method for grading bladder cancer malignancy using features generated by means of computer-assisted image analysis

Abstract

We recently developed a new fuzzy classification method named PROAFTN, which uses the multicriteria decision aid approach. The aim of this paper is to evaluate the performance of the proposed method in grading bladder cancer malignancy. For this purpose, 292 cases of bladder tumors, classified according to the old World Health Organization classification by a pathologist into three subjective levels of malignancy (137 low grade, 124 intermediate grade and 31 high grade), were tested using the 10-fold cross-validation technique. The features were generated by means of computer-assisted microscope analysis of cell images and submitted to the PROAFTN method, which determines the membership degrees of each case in each grade. In order to determine the accuracy of the classification, the results obtained by the method were compared to the subjective grading made by the pathologist. The PROAFTN method yielded good results in terms of discrimination between low and high grades, while it was unable to provide a satisfactory discrimination within the heterogeneous intermediate grade II group. These results seem to be in agreement with the literature concerning the clinical heterogeneity of the intermediate grade. Given these results, it will be essential in the future to see whether any combination of other sets of features, such as clinical data, can better discriminate between the grades.

M. Iqbal:

Spline regularization of numerical inversion of Mellin transform

Abstract

A method for inverting the Mellin transform is described. The method applies an expansion in Laguerre polynomials, a conversion to the Laplace transform, and a convolution integral equation. The quality of the method is demonstrated on ill-posed problems considered in the literature.
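For context, the standard definitions behind the conversion step (not reproduced from the paper): the Mellin transform and the change of variable that turns it into a two-sided Laplace transform,

```latex
M[f](s) = \int_0^{\infty} x^{\,s-1} f(x)\, dx ,
\qquad
x = e^{-t} \;\Longrightarrow\;
M[f](s) = \int_{-\infty}^{\infty} e^{-st} f\!\left(e^{-t}\right) dt ,
```

so that inverting the Mellin transform can be reduced to the (likewise ill-posed) inversion of a Laplace transform.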

No. 2

I. Masłowska, D. Weiss:

JUICER - a data mining approach to information extraction from the WWW

Abstract

We present a novel approach to automatic text mining on the World Wide Web. Since the enormously dynamic growth of the WWW creates a need for new, more powerful information extraction tools, we designed and implemented a system that adapts techniques originally introduced in the field of data mining. We believe that similar systems, which are usually based on machine learning or natural language processing methods, can prove ineffective when dealing with very large numbers of hypertext documents of differing structure and subject. Moreover, such systems tend to treat HTML documents as plain text, not taking into account the additional information contained in their markup tags.

G. E. Antoniou:

Minimal circuit and state space realization for generalized Kelly-Lochbaum discrete 2-D filters

Abstract

A circuit realization for generalized Kelly-Lochbaum discrete two-dimensional (2-D) filters, with a minimum number of delay elements, is proposed. Using the proposed circuit implementation, the corresponding state space realization is derived from its block diagram representation. The dimension of the 2-D state space vector is minimal, and the corresponding 2-D transfer function is characterized by the all-pass property.

J. Brzeziński, M. Sajkowski:

The application of computational geometry algorithms to the timed verification of communication protocols

Abstract

In the paper we present the application of computational geometry algorithms to the exact determination of the times remaining until the occurrence of events in communication protocols, in the case when these times are expressed in the form of simple (i.e., in general, non-convex) polygons (this form of time is used in the polygon time structure). This exact solution may be better than the approximate (i.e., interval) one, especially since the approximate solution may lead to a protocol that is in fact correct being evaluated as incorrect.

R. Susmaga:

New test for inclusion minimality in reduct generation

Abstract

The paper addresses the problem of generating reducts, i.e. minimal subsets of attributes that satisfy pre-defined consistency and minimality conditions. The main problem with the reduct generating process is its high computational complexity. This paper describes a breadth-first search algorithm for reduct generation based on the notion of the discernibility matrix. The most time-consuming part of the algorithm is a test for inclusion minimality that has to be applied to every potential reduct. As has been shown, the implementation of this minimality test strongly determines the behaviour of the whole algorithm. A number of different minimality tests have been presented and computationally evaluated in [33]. This paper is in a sense its continuation, in that it introduces further improvements to the minimality tests. It also presents the results of a set of experiments with non-trivial real-life data sets, in which the new tests have been compared with their earlier implementations.
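To illustrate what an inclusion minimality test decides (this is the textbook formulation over a discernibility matrix, not the paper's improved test): a candidate attribute set is a reduct iff it intersects every non-empty matrix entry and no proper subset does. Since coverage is monotone, it suffices to check one-element removals.

```python
def covers(subset, matrix):
    """True if `subset` intersects every non-empty entry of the
    discernibility matrix (each entry is the set of attributes
    discerning one pair of objects)."""
    return all(subset & entry for entry in matrix if entry)

def is_reduct(candidate, matrix):
    """Inclusion-minimality test: `candidate` is a reduct iff it
    covers the matrix and dropping any single attribute breaks
    coverage. Monotonicity of coverage makes one-element removals
    sufficient: if every maximal proper subset fails, all do."""
    if not covers(candidate, matrix):
        return False
    return all(not covers(candidate - {a}, matrix) for a in candidate)
```

On the toy matrix [{1, 2}, {2, 3}, {1, 3}], the set {1, 2} is a reduct, while {1, 2, 3} covers the matrix but fails the minimality test. The quadratic number of coverage checks per candidate is exactly why this test dominates running time in a breadth-first reduct generator.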

No. 3

G.E. Antoniou:

2-D Quadratic continued fraction expandable systems: minimal state space realization

Abstract

In this paper, two new two-dimensional (2-D) continued fraction expansions, having a quadratic structure, are presented. Using their circuit realizations, which are characterized by the minimum number of delay elements, minimal state space representations are derived. The matrices A, b, c of the 2-D state space model are presented in generalized closed form. Two examples are given to illustrate the new realizations.

F. Glineur:

An extended conic formulation for geometric optimization

Abstract

The author has recently proposed a new way of formulating two classical classes of structured convex problems, geometric and lp-norm optimization, using dedicated convex cones. This approach has some advantages over the traditional formulation: it simplifies the proofs of the well-known associated duality properties (i.e. weak and strong duality), and the design of a polynomial algorithm becomes straightforward. In this article, we take a step towards the description of a common framework that includes these two classes of problems. Indeed, we present an extended variant of the cone for geometric optimization previously introduced by the author and show that it is equally suitable for formulating this class of problems. This new cone has the additional advantage of being very similar to the cone used for lp-norm optimization, which opens the way to a common generalization.
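For readers outside the area, the standard form of a geometric optimization (geometric programming) problem that such conic formulations capture is

```latex
\min_{x > 0} \; f_0(x)
\quad \text{s.t.} \quad f_i(x) \le 1, \;\; i = 1, \dots, m,
\qquad
f_i(x) = \sum_{k} c_{ik}\, x_1^{a_{ik1}} \cdots x_n^{a_{ikn}},
\;\; c_{ik} > 0 ,
```

where each $f_i$ is a posynomial; the change of variables $y_j = \log x_j$ renders the problem convex. This standard form is given for context only; the paper's extended cone itself is not reproduced here.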

T. Morzy, M. Wojciechowski, M. Zakrzewicz:

Clustering Sequences of Categorical Values

Abstract

Conceptual clustering is a discovery process that groups a set of data in such a way that intra-cluster similarity is maximized and inter-cluster similarity is minimized. Traditional clustering algorithms employ some measure of distance between data points in n-dimensional space. However, not all data types can be represented in a metric space, and therefore no natural distance function is available for them. We address the problem of clustering sequences of categorical values. We present a measure of similarity for the sequences and an agglomerative hierarchical algorithm that uses frequent sequential patterns found in the database to efficiently generate the resulting clusters. The algorithm iteratively merges smaller, similar clusters into bigger ones until the requested number of clusters is reached.
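The overall scheme can be sketched as follows. The Jaccard similarity over the sets of frequent patterns each cluster supports is an illustrative stand-in, not the paper's actual measure, and the patterns are assumed to have been mined beforehand.

```python
def supports(seq, pattern):
    """True if `pattern` occurs in `seq` as a (not necessarily
    contiguous) subsequence -- the usual sequential-pattern notion."""
    it = iter(seq)
    return all(item in it for item in pattern)

def similarity(a, b, patterns):
    """Jaccard similarity of the sets of frequent patterns supported
    by clusters a and b (each a list of sequences). Illustrative
    stand-in for the paper's similarity measure."""
    pa = {i for i, p in enumerate(patterns) if any(supports(s, p) for s in a)}
    pb = {i for i, p in enumerate(patterns) if any(supports(s, p) for s in b)}
    if not pa and not pb:
        return 0.0
    return len(pa & pb) / len(pa | pb)

def cluster(sequences, patterns, k):
    """Agglomerative merging: start from singleton clusters and
    repeatedly merge the most similar pair until k clusters remain."""
    clusters = [[s] for s in sequences]
    while len(clusters) > k:
        i, j = max(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda ij: similarity(clusters[ij[0]],
                                             clusters[ij[1]], patterns))
        clusters[i] += clusters.pop(j)
    return clusters
```

For instance, with the patterns ('a','b') and ('c','d'), the sequences 'ab' and 'xaxb' end up in one cluster and 'cd' and 'cxd' in another: the patterns, not any geometric distance, drive the grouping.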

C. Spathis, M. Doumpos, C. Zopounidis:

Detecting falsified financial statements using multicriteria analysis: the case of Greece

Abstract

This paper develops a model for detecting factors associated with falsified financial statements (FFS). A sample of 76 firms, described by ten financial ratios, is used for detecting factors associated with FFS. The identification of such factors is performed using a multicriteria decision aid classification method (UTADIS - UTilités Additives DIScriminantes). The developed model is accurate in classifying the total sample correctly. The results therefore demonstrate that the model is effective in detecting FFS and could be of assistance to auditors, taxation and Stock Exchange officials, state authorities and regulators, and the banking system.
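The shape of a UTADIS-style classification can be sketched as below: a global utility is an additive combination of marginal utilities of the criteria (here, financial ratios), and a firm is assigned to the first class whose utility threshold its score reaches. In UTADIS the weights, marginals and thresholds are estimated by linear programming from a training sample; the values used here are purely illustrative assumptions.

```python
def utadis_classify(ratios, weights, marginals, thresholds):
    """UTADIS-style sorting sketch: additive global utility compared
    against class cut-off thresholds (sorted descending). All
    parameters are assumed given; in UTADIS they are estimated by
    linear programming from a reference sample."""
    u = sum(w * m(x) for x, w, m in zip(ratios, weights, marginals))
    for label, t in thresholds:
        if u >= t:
            return label
    return "falsified"  # below every cut-off

# Illustrative setup: two ratios, each normalised to [0, 1].
marginals = [lambda x: min(max(x, 0.0), 1.0)] * 2
weights = [0.6, 0.4]
thresholds = [("non-falsified", 0.5)]
```

With these illustrative parameters, a firm with ratios (0.9, 0.8) scores 0.86 and is classed as non-falsified, while one with (0.1, 0.2) scores 0.14 and falls below the cut-off.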

Last changed on: 2008-05-19 by Bartłomiej Prędki