Monday 28 October 2019

Propositional Calculus Evaluation

Hi there, nothing really new today: I've pulled together some old code for parsing and evaluating logical expressions, cleaned it up a bit and made it more usable. I used an old tool, Reverse Polish Notation, with which tokenizing, parsing and evaluation are all very simple. The whole file is about 195 lines of code. Unfortunately, the time complexity of the algorithm is exponential (as, in fact, for any other known algorithm for this problem, although the resolution method does better on many kinds of formulas). How the tokenize, parse and eval functions work is described in great detail in many places; what I add here is the fill_values function, which generates the whole table of truth assignments for the algorithm to check:

import re
from itertools import product


def fill_values(formula):
    """Yields the formula with its letters replaced by every combination of 0s and 1s."""
    letters = "".join(set(re.findall("[A-Za-z]", formula)))
    for digits in product("10", repeat=len(letters)):
        table = str.maketrans(letters, "".join(digits))
        yield formula.translate(table)


It uses some handy Python features (that's why I like to use Python: not for its efficiency, but because there are lots of libraries to get things done quickly). The whole thing works something like this (asciinema screencast):

https://asciinema.org/a/TarY4YQQTbm3To5ZpBXTci3Y4?t=8
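To give an idea of how fill_values plugs into the rest, here is a minimal sketch of a satisfiability/tautology check. The real evaluator lives in eval.py on Github; here it is just passed in as a callable (the parameter name evaluate is my placeholder, not the post's API):

def is_satisfiable(formula, evaluate):
    # evaluate is assumed to map a formula with all letters already replaced by 0/1 to True or False
    return any(evaluate(filled) for filled in fill_values(formula))

def is_tautology(formula, evaluate):
    # a tautology is true under every assignment
    return all(evaluate(filled) for filled in fill_values(formula))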

It still lacks error handling and a check that a formula is syntactically correct. For example, it happily does this:

$ ./eval.py 
Propositional Logic Parser, press help, /h, -h or -usage for help
> q ~ a
Formula is satisfiable


I would like anybody doing some logic exercises to try it out - or just to play with it for fun; it will certainly help:).

Code on Github. You can try it out of the box here:
https://repl.it/@lion137/propositionalcalculluseval

Thanks for your patience, bye for now:)

Thursday 10 October 2019

Skew Heap In Java

Hello, I've just finished my Java implementation of the skew heap data structure, and also a heap sort using this kind of heap. It's very simple: under the hood the structure is a binary tree, and heap order is maintained via a special merge function. Here is the basic structure:
 
public class SkewHeap <V extends Comparable<V>> {

    SkewNode root;
    public SkewHeap() {
        this.root = null;
    }

    class SkewNode {
        SkewNode leftChild, rightChild;
        V data;

        public SkewNode(V _data, SkewNode _left, SkewNode _right) {
            this.data = _data;
            this.leftChild = _left;
            this.rightChild = _right;
        }
    }
}
 
And here is the crucial merge:

SkewNode merge(SkewNode left, SkewNode right) {
        // merging with an empty heap is a no-op
        if (null == left)
            return right;
        if (null == right)
            return left;
        // keep the smaller (or equal) root on top and swap the children while
        // merging down the right spine - this keeps the tree balanced on average
        if (left.data.compareTo(right.data) <= 0) {
            return new SkewNode(left.data, merge(right, left.rightChild), left.leftChild);
        } else
            return new SkewNode(right.data, merge(left, right.rightChild), right.leftChild);
    }


It does exactly what Wikipedia describes: it maintains the heap structure while keeping the minimum at the root. The other operations are built on it: insert merges the heap with a single-node heap, and pop takes the root's value and merges its two children. Sorting is done via three functions:


SkewHeap listToHeap(List list) {
        SkewHeap o = new SkewHeap();
        for (int i = 0; i < list.size(); i++)
            o.insert((Comparable) list.get(i));
        return o;
    }

List heapToList(SkewHeap tree) {
        // keep popping the minimum until the heap is empty;
        // pop throws IndexOutOfBoundsException when there is nothing left
        List ll = new ArrayList();
        try {
            while (true) {
                ll.add(tree.pop());
            }
        }
        catch (IndexOutOfBoundsException e) {
            return ll;
        }
    }

List sort(List list) {
        return heapToList(listToHeap(list));
    }

The first, as you can see, packs the list parameter into a heap; the second unpacks the whole heap into a list by popping minimums, so the resulting list comes out sorted; and finally sort composes the two. The sorting performance is O(n log(n)), although it is slower than the standard Java sort. Code on github. Thank you.
Sources: Wikipedia, What is Good About Haskell

Thursday 18 July 2019

AVL Tree in Java

I have not blogged for some time, but there is something new: an AVL tree in Java. Code on github: https://github.com/lion137/Java.
Thanks, enjoy! 

Monday 4 March 2019

Fundamental Algorithms: Parse to RPN

Another one: parsing to Reverse Polish Notation. Why is it so important? For one thing, on the first calculators you had to type expressions in RPN! And it's very easy to find the value of an expression in RPN. Dead simple: given one, like:
3 2 + 4 3 * +
it's enough to take an empty LIFO stack, go through the expression and:
1. When we see a number, push it onto the stack.
2. When we see an operator, pop the stack twice, perform the operation and push the result.
3. When done, pop the stack - that's the result (a sketch follows below)!
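The code is on github; purely as an illustration of these three steps (not the repository's actual code), a minimal Python sketch of the evaluation part, assuming a space-separated expression and the four basic arithmetic operators, could look like this:

import operator

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul, "/": operator.truediv}

def eval_rpn(expression):
    """Evaluates a space-separated RPN expression such as '3 2 + 4 3 * +'."""
    stack = []
    for token in expression.split():
        if token in OPS:
            # operator: pop two operands, apply, push the result back
            b, a = stack.pop(), stack.pop()
            stack.append(OPS[token](a, b))
        else:
            # number: just push it onto the stack
            stack.append(float(token))
    return stack.pop()  # the single remaining value is the result

print(eval_rpn("3 2 + 4 3 * +"))  # -> 17.0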

Parsing is not as easy, but it is explained in detail, for example, here.
Code on github.
Whole Series.

Thank you!

Fundamental Algorithms: Links

Links to whole series:
1. Order Statistics - C++
2. Polynomial GCD, Polynomial Division - Python
3. Recursive Permutations, All the Subset of Set and Variations With Repetitions - C++
4. Parsing (and evaluating) to RPN

Wednesday 27 February 2019

Fundamental Algorithms III

Three more fundamental algorithms in the series. Today: recursive permutations, all the subsets of a given set, and variations with repetitions. All of them are written in C++ (non-generic, I used std::vector). Code (I hope it's self-explanatory; any questions, just ask).
Whole series.
Thank you.

Sunday 3 February 2019

Fundamental Algorithms: Polynomial GCD

Another one in the series on fundamental algorithms; today: polynomial division (and GCD). We assume that the polynomials are over some field; this is a good thing, since fields have lots of nice properties. Explicitly: if a and b are in the field and b is not zero, then there always exists an element c of the field such that a = bc.
So division is easy: just "push it" through over the field, much like division in a given base, or multiple-precision division.
Algorithm, Knuth "The Art Computer Programming Vol 4" and the others:

# input: coefficients of polynomials as lists u, v, with v != 0,
# in the order u[0] = free coefficient of u(x), ...
# returns polynomials q, r such that u = qv + r (the remainder)
# requires len(u) >= len(v) >= 1 and a nonzero leading coefficient v[-1]
# note: u is modified in place; the remainder is returned as u[:n]
def poly_divide(u, v):
    m = len(u) - 1
    n = len(v) - 1
    q = [0] * (m - n + 1)
    lim = m - n
    for k in range(lim, -1, -1):
        q[k] = u[n + k] / v[n]
        for j in range(n + k - 1, k - 1, -1):
            u[j] = u[j] - q[k] * v[j - k]
    return q, u[:n]
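As a quick sanity check (my own toy example, not from the post): dividing x^2 - 1 by x - 1 should give the quotient x + 1 and remainder 0; coefficients are listed from the free term upwards, as in the comments above:

u = [-1, 0, 1]   # x^2 - 1
v = [-1, 1]      # x - 1
q, r = poly_divide(u, v)
print(q, r)      # -> [1.0, 1.0] [0.0], i.e. q(x) = x + 1, remainder 0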

Having division, we can compute the Greatest Common Divisor, showing that the class of Euclidean domains is larger than one might expect - polynomials over a field form one too:

def poly_gcd(u, v):
    if not any(v): return u
    else: return poly_gcd(v, poly_mod(u, v))

poly_mod is just the remainder returned by the division algorithm. I think it's beautiful, don't you?
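poly_mod itself is not listed in the post (it is in division.py); a minimal sketch of how it could simply wrap poly_divide - passing a copy, since poly_divide modifies u in place - would be:

def poly_mod(u, v):
    # remainder of u(x) divided by v(x); the quotient is discarded
    _, r = poly_divide(list(u), v)
    return r

# e.g. poly_gcd([-1, 0, 1], [-1, 1]) -> [-1, 1], i.e. gcd(x^2 - 1, x - 1) = x - 1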
The code, as usual, is on github (division.py). Thank you, see you next time!

Friday 25 January 2019

Fundamental Algorithms

I'm starting a new series (we'll see how it goes:)) about some very basic algorithms implemented in C++. The first: order statistics. I use the classic divide-and-conquer strategy with a random pivot (not the best possible choice, but a reasonable one - it gives expected linear running time). As a bonus there is a quicksort algorithm which uses the same partition function. Code (partition and the order statistic, quick_select, below):

#include <random>
#include <utility>

int partition_random(int a[], int p, int r);  // defined below, used by quick_select

// Lomuto partition around the last element a[r]; returns the final index of the
// pivot, with everything <= the pivot moved to its left (the array is mutated in place)
int partition(int a[], int p, int r) {
    int t = a[r];
    int i = p - 1;
    for (int j = p; j < r; j++) {
        if (a[j] <= t) {
            i += 1;
            std::swap(a[i], a[j]);
        }
    }
    std::swap(a[i + 1], a[r]);
    return i + 1;
}

// returns the k-th smallest element of a[s..n] (k is 1-based)
int quick_select(int a[], int s, int n, int k) {
    int r = partition_random(a, s, n);
    if (r - s == k - 1) { return a[r]; }
    else {
        if (k - 1 < r - s) {
            return quick_select(a, s, r - 1, k);
        }
        else {
            return quick_select(a, r + 1, n, k - r + s - 1);
        }
    }
}

// swaps a randomly chosen element into the last position, then partitions around it
int partition_random(int a[], int p, int r) {
    std::random_device rd;
    std::mt19937 rng(rd());
    std::uniform_int_distribution<int> uni(p, r);
    auto rn = uni(rng);
    std::swap(a[r], a[rn]);
    return partition(a, p, r);
}

As you can see, quick_select uses partition to split the initial array around a random index, and then recurses into one of the two parts, depending on whether k - the rank of the element we are looking for - falls into the bottom or the top part of the array.
The partition algorithm is commonly known and explained all over the web; it returns the final position of the pivot (the last element of the range), after moving every element less than or equal to the pivot to its left, mutating the array in place. It can be used to compute the median:

float median(int a[], int n) {
    if (n % 2 == 1)
        return quick_select(a, 0, n - 1, n / 2 + 1);
    int l = n / 2;
    int r = n / 2 + 1;
    return (1.0 * (quick_select(a, 0, n - 1, l) +
                   quick_select(a, 0, n - 1, r))) / 2;
}
In the C++ Standard Library, the order-statistic selection is called std::nth_element. Code on github.

Thank you.

Sunday 6 January 2019

Kleisli Category by Example

My New Year's resolution: studying category theory. I've started softly, reading this. The book contains an exercise which is the subject of this post. I don't want to go into the details - anyone can read it (strongly recommended) - but in short:
A category is a structure consisting of objects (they are primitives as far as the theory is concerned) and morphisms between them (they are not really functions, but we can visualize them as such). For every object a there must be an identity morphism going from a to a; for every pair of morphisms going from a to b and from b to c there must be a morphism going from a to c, their composition; and composition is associative.

Thinking about programming languages, we can have a category of types and functions, where the objects are types and the functions are the morphisms between them. A Kleisli category is constructed as follows. Say we have a category of types such as Integer and Double, and morphisms between them, for example int => int, and so on. Say we want to make every such function (e.g. square root) total; this can be done by making the function's return type something like a pair: a Double together with a String value, "Success" or "Fail" - or, even better, a Boolean, true or false. So now we have functions from int to pairs. If we can make a category out of these new objects and functions, we have a Kleisli category.

The language is C++, and the morphisms of the category are T => optional<T> (a value of type T paired with a bool) - no surprise here:). The optional class:
template<class A> class optional {
    bool _isValid;
    A _value;

public:
    optional() : _isValid(false) {}
    optional(A e) : _isValid(true), _value(e) {}
    optional(A e, bool v) : _isValid(v), _value(e) {}
    A value() const { return _value; }
    bool validation() const { return _isValid; }
};
A pair of a value and a Boolean validity indicator, with constructors and accessor methods.
We deal with square root and reciprocal:
optional<double> safe_root(double x) {
    if (x >= 0) return optional<double>{std::sqrt(x)};
    else return optional<double>{};
}

optional<double> safe_reciprocal(double x) {
    if (x != 0) return optional<double>{1 / x};
    else return optional<double>{};
}


As the definition states, the job is to construct identity and composition. Identity seems easy:
optional<double> identity(double x) {
    return optional<double>{x};
}
If the original identity goes from Double to Double, then the new one must go from Double to optional and set isValid to true. There is no logical alternative: true is neutral with respect to conjunction (why conjunction, in a moment), and if we don't change the value (the very heart of the identity operation), then there still is a value, so the indicator must be true.

For composition, we first need a compose function (it takes f and g and returns g○f, "g after f"); to write it, here is a bit of newer C++ lambda syntax (thanks to Eric Niebler):
auto const compose = [](auto m1, auto m2) {
    return [m1, m2](auto x) {
        auto p1 = m1(x);
        auto p2 = m2(p1.value());
        return optional<double>(p2.value(),
                                p1.validation() && p2.validation());
    };
};
Inside the lambda, p1 and p2 hold the results of the two incoming functions. The returned optional carries the value after the second application, together with the logical conjunction of the two incoming isValid fields. Is this really a composition? It must be: the value parts are composed the way ordinary functions compose, and the isValid parts are composed by the logical AND operator. For the composite to yield a valid value, both functions' validation fields must be true; if one or both are false, the composition fails (as far as the value is concerned) and validation is set to false. And it is also associative, because both function composition and conjunction are associative.
All works as expected: 
auto const safe_root_reciprocal = compose(safe_reciprocal, safe_root);
auto const safe_rootIdR = compose(safe_root, identity);
auto const safe_rootIdL = compose(identity, safe_root);
std::cout << safe_root_reciprocal(0).validation() << "\n"; // -> 0
std::cout << safe_root_reciprocal(3).validation() << "\n"; // -> 1
std::cout << safe_root_reciprocal(3).value() << "\n"; // -> 0.57735
std::cout << (safe_rootIdR(2).value()) << "\n"; // -> 1.41421
std::cout << (safe_rootIdL(2).value()) << "\n"; // -> 1.41421

The left and right identities give the expected results, and the composition safe_root_reciprocal does its job. Looks like it's done. Once more, Happy New Year!