
Signal vs Trace

Learning Objectives

At the end of this sub-unit, students should

  • appreciate the connection between the signal processing view and the trace table.

Comparison

Recall the problem of finding the sum of squares of odd numbers from before. In the previous sub-unit, we solved it with the signal processing view. However, we can also attempt a more direct solution.

Sum of Squares of Odd

1  def sum_squares_odd(m):
2    res = 0
3    for n in range(1, m + 1):
4      val = 2 * n - 1
5      sqr = val * val
6      res = res + sqr
7    return res

If we try to construct the trace table for the code above, we will notice some interesting properties. Let us show this for m = 4; the subscript on each variable indicates the line of code at which its value is recorded.

Iteration | res₃ | n₃ | val₄ | sqr₅ | res₆
1         | 0    | 1  | 1    | 1    | 1
2         | 1    | 2  | 3    | 9    | 10
3         | 10   | 3  | 5    | 25   | 35
4         | 35   | 4  | 7    | 49   | 84
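
If you want to reproduce this table yourself, here is a small sketch (our own addition, not part of the code above) that prints one row per iteration in the same column order.

def sum_squares_odd_traced(m):
  res = 0
  for i, n in enumerate(range(1, m + 1), start=1):
    val = 2 * n - 1
    sqr = val * val
    # print the row before res is updated, matching the columns above
    print(i, res, n, val, sqr, res + sqr)
    res = res + sqr
  return res

sum_squares_odd_traced(4)   # prints the four rows shown in the table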

Here is the interesting part. If we exclude res from our trace table, draw a diagonal line from top left to bottom right, and flip the table on this line, we will get the following table.

Iteration | 1 | 2 | 3  | 4
n₃        | 1 | 2 | 3  | 4
val₄      | 1 | 3 | 5  | 7
sqr₅      | 1 | 9 | 25 | 49

Does this table look familiar? Consider splitting each row into a list. Can you see the connection yet?

n   = [1, 2,  3,  4]
val = [1, 3,  5,  7]
sqr = [1, 9, 25, 49]

Each list can be obtained from the previous one using map. So the idea of the signal processing view is related to the idea of a trace table. This is to be expected, as we are solving the same problem. With a trace table, we look at what we need to do in each iteration as we fill the table row by row. With the signal processing view, we flip the table sideways and think about the connection between the rows.
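
As a sketch of this connection, each list can be computed from the previous one with Python's built-in map; the lambda expressions here are our own illustration of the transformations.

m = 4
n   = list(range(1, m + 1))              # [1, 2, 3, 4]
val = list(map(lambda x: 2 * x - 1, n))  # [1, 3, 5, 7]
sqr = list(map(lambda x: x * x, val))    # [1, 9, 25, 49]
res = sum(sqr)                           # 84, the final accumulation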

This flipping of the trace table sideways is also present when using filter. Let us see how it looks.

Sum of Squares of Odd

1  def sum_squares_odd(m):
2    res = 0
3    for n in range(1, 2 * m + 1):
4      if n % 2 == 1:
5        val = n
6        sqr = val * val
7        res = res + sqr
8    return res

The trace table for m = 4 is shown below.

Iteration | res₃ | n₃ | n % 2 == 1 | val₅ | sqr₆ | res₇
1         | 0    | 1  | True       | 1    | 1    | 1
2         | 1    | 2  | False      |      |      |
3         | 1    | 3  | True       | 3    | 9    | 10
4         | 10   | 4  | False      |      |      |
5         | 10   | 5  | True       | 5    | 25   | 35
6         | 35   | 6  | False      |      |      |
7         | 35   | 7  | True       | 7    | 49   | 84
8         | 84   | 8  | False      |      |      |

If we ignore res and the expression n % 2 == 1, we will get the following table after the diagonal flip.

Iteration | 1 | 2 | 3 | 4 | 5  | 6 | 7  | 8
n₃        | 1 | 2 | 3 | 4 | 5  | 6 | 7  | 8
val₅      | 1 |   | 3 |   | 5  |   | 7  |
sqr₆      | 1 |   | 9 |   | 25 |   | 49 |

Here, the blank cells correspond to the iterations in which the condition is False. In the signal processing view, these blanks are removed completely, and this is achieved using filter.

n   = [1, 2, 3 , 4 , 5 , 6, 7 , 8]
val = [1,    3,      5 ,    7    ]
sqr = [1,    9,      25,    49   ]
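
As a sketch, the same lists can be produced with Python's built-in filter and map; again, the lambda expressions are our own illustration.

m = 4
n   = list(range(1, 2 * m + 1))              # [1, 2, 3, 4, 5, 6, 7, 8]
val = list(filter(lambda x: x % 2 == 1, n))  # [1, 3, 5, 7]
sqr = list(map(lambda x: x * x, val))        # [1, 9, 25, 49]
res = sum(sqr)                               # 84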

Why Signal Processing View?

Since the signal processing view is just another way of solving problems, is there a real advantage to it? The advantage is the same as that of using a function: we do not have to care about the implementation. Better yet, because the function is written by people who are smarter than us, it can be implemented much more efficiently than we could manage ourselves. How much more?

A simple way to improve the speed is to parallelize the computation. This is easy to do for map, as the application of F to an element should not depend on the other elements; that means each element is considered independently of the others. Of course, this might not always be the case, but it will be the case if F is a straightforward function.
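
For instance, here is a minimal sketch of a parallel map using Python's standard multiprocessing module; square plays the role of F and is our own example function.

from multiprocessing import Pool

def square(x):
  return x * x

if __name__ == "__main__":
  # each element is squared independently, possibly on a different CPU core
  with Pool() as pool:
    print(pool.map(square, [1, 3, 5, 7]))  # [1, 9, 25, 49]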

Similarly, filter may be done in parallel because we can check each element independently. The accumulation is more difficult. It can still be parallelized well if we add restrictions to the operation op, for instance requiring op to be associative so that partial results can be combined in any grouping.
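
As a sketch of why such a restriction helps, an associative op such as + lets us reduce chunks of the list independently and then combine the partial results; the chunking below is our own illustration.

from functools import reduce
from operator import add

data = [1, 9, 25, 49]
# the two chunks could be reduced on different workers
left, right = data[:2], data[2:]
partials = [reduce(add, left), reduce(add, right)]
total = reduce(add, partials)  # 84, the same as reduce(add, data)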

In fact, this kind of thinking is prevalent in the area of big data analysis. Dealing with big data requires efficient algorithms, and by focusing on efficient parallel implementations of map, filter, and accumulate, the processing can be done quickly. There is even a name for this kind of framework: MapReduce¹.


  1. In this case, reduce is accumulate.