Convolution Sum and Convolution Integral

Learning - Signals and Systems

summaryOfConvolution

One of the most important operations in signal processing is convolution. For a typical LTI system, the output can be produced by convolving the input with the given impulse response. In general, we have two types of convolution:

  • Convolution sum for discrete-time LTI systems
  • Convolution integral for continuous-time LTI systems

convolutionSum.PNG

convolutionIntegral.PNG
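In case the two formula images above do not display, the standard definitions are:

$$y[n] = x[n] * h[n] = \sum_{k=-\infty}^{\infty} x[k]\,h[n-k]$$

$$y(t) = x(t) * h(t) = \int_{-\infty}^{\infty} x(\tau)\,h(t-\tau)\,d\tau$$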

 

Here, we will see how to compute the convolution integral. Basically, the convolution integral involves four steps, as shown below:

convolutionIntegralStep.PNG
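In case the figure does not display, the four steps are essentially the standard graphical procedure:

  • Step 1: replace t with a dummy variable τ, so that we work with x(τ) and h(τ).
  • Step 2: fold (time-reverse) the impulse response to obtain h(−τ).
  • Step 3: shift the folded response by t to obtain h(t−τ).
  • Step 4: multiply x(τ) by h(t−τ) and integrate over τ, repeating for every range of t where the overlap changes.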

 

Let's look at a simple example, where x(t) and h(t) are given below:

ci_Eg

Following step 1, use a dummy variable τ to replace t:

ci_step1

Then flip h(τ), as indicated in step 2:

ci_step2

Repeat steps 3 and 4 until the final answer is obtained:

ci_step3-4

ci_step3-4(2)

 

ci_step3-4(3)

ci_step3-4(4)

So, the final answer is:

finalAns
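As an extra illustration (this is not the same pair of signals as in the figures above, just a small self-contained example worked through the same four steps), take x(t) = u(t) − u(t−1), a unit pulse, and h(t) = e^(−t)u(t), where u(t) is the unit step. Then

$$y(t) = \int_{-\infty}^{\infty} x(\tau)\,h(t-\tau)\,d\tau = \int_{0}^{1} e^{-(t-\tau)}\,u(t-\tau)\,d\tau$$

and carrying out the integration over the three ranges of t gives

$$y(t) = \begin{cases} 0, & t < 0 \\ 1 - e^{-t}, & 0 \le t < 1 \\ (e-1)\,e^{-t}, & t \ge 1 \end{cases}$$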

 

Similarly, we can also compute the convolution sum for a discrete-time LTI system by using the following four steps:

convolutionSumStep.PNG
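As a small preview, here is a minimal C# sketch of evaluating the convolution sum directly from its definition (the function name is illustrative, and both sequences are assumed to be finite-length and to start at n = 0):

```
// Minimal sketch: y[n] = sum over k of x[k] * h[n - k]
// for finite-length sequences x and h that both start at index 0.
static double[] Convolve(double[] x, double[] h)
{
    var y = new double[x.Length + h.Length - 1];
    for (int n = 0; n < y.Length; n++)
    {
        for (int k = 0; k < x.Length; k++)
        {
            int m = n - k;                   // index into the flipped-and-shifted h
            if (m >= 0 && m < h.Length)
                y[n] += x[k] * h[m];         // multiply and accumulate
        }
    }
    return y;
}
```

For example, Convolve(new double[] { 1, 1, 1 }, new double[] { 1, 2 }) returns { 1, 3, 3, 2 }.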

We will look at some related examples of computing the convolution sum in the next post. Stay tuned.

A simple UWP App that will speak based on user’s input

Learning - UWP

The Universal Windows Platform (UWP) allows you to develop an app with one API set for all types of devices with different screen resolutions. Furthermore, you can code a UWP app using your own preferred language.

For this tutorial, we will use C# to create a simple UWP app. The basic function of the app is just to take text input from the user and speak it out. We will create the app using Visual Studio 2017; if you do not have Visual Studio installed, you can download the Community edition here.

First, open Visual Studio, then go to File -> New -> Project.

01. New Project

A New Project dialog will pop up. Under the templates, choose Visual C# and then Windows Universal. You will find four options available in the right panel; choose "Blank App". After that, give your app a name, e.g., "inputAndSpeak". Leave the rest as default, and click OK.

02. Project Template

Once you click OK, a dialog will pop up asking you for the target version and minimum version. Just leave them as default and click OK to proceed.

03. Default Selection

 

You should see a Solution Explorer, which contains all the necessary files for the application development. We will focus on MainPage.xaml, so under the Solution Explorer, locate MainPage.xaml and double-click on it.

04. MainPageXAML

MainPage.xaml will be opened. It contains a split view with both a design view and a code view. You can arrange the split view in a top-down or left-right layout. At the top of the design view, you can find the device model the design view is catered for. Change the device model to an 8″ tablet as follows:

05. SelectDevice

Then, open the Toolbox, and drag a “Button” and a “TextBox” into the blank page area in the design view. Visual Studio will update the XAML code accordingly.

06. AddButtonTextBox

07. InsideTheGrid

In the code view, we can see that two items have been added to the Grid, i.e., the Button and the TextBox. Alongside the TextBox, you can find all the related properties, which depend on how you placed your TextBox on the blank page. Besides that, you will also see Text="TextBox"; if you run the program, you will see the string "TextBox" appear inside your TextBox element. To get rid of the default text string, just delete "TextBox" and leave it empty as "".

08. EmptyTheText

Add a name to the TextBox element; the name is used to refer to the TextBox element inside the C# code. Before the ">", type Name="userInput".

09. GiveTextBoxAName

You can try to run your program by clicking the run button (or pressing F5). You can enter text into the TextBox, but when you click the button, nothing happens. We need to add code that runs when the user clicks the button; in other words, we need to specify what we want the program to do when the user triggers the click event. To add the code, simply double-click the button in the design view. Visual Studio will automatically generate a Button_Click event handler in the .cs file.

10. DoubleClickButton

However, there is no code inside the Button_Click handler yet. First, let's capture the text string entered by the user:

11. stringInput
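Concretely, the handler might look something like the sketch below at this point (assuming the TextBox was named userInput as above; the exact generated signature may differ slightly):

```
private void Button_Click(object sender, RoutedEventArgs e)
{
    // Grab whatever the user typed into the TextBox named "userInput".
    string input = userInput.Text;
}
```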

After that, let's add a MediaElement to turn the text input into audio output. Note that in order to use it for speech synthesis, you need to reference the related library available in the Windows API. To add the related library, just hover your cursor over the red squiggly line, and a lightbulb will pop up asking you how you would like to fix the problem. Select "using Windows.Media.SpeechSynthesis;". Alternatively, you can add the directive manually.

12-mediaelement.jpg

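As a rough sketch of this step (the variable names are illustrative, and the MediaElement is created in code here rather than in XAML), the speaking part could look like this once the Windows.Media.SpeechSynthesis directive is in place; note that the await keyword requires the handler to be marked async, which is covered further below:

```
// Turn the captured text into an audio stream and play it.
// Requires: using Windows.Media.SpeechSynthesis;
var synthesizer = new SpeechSynthesizer();
SpeechSynthesisStream stream = await synthesizer.SynthesizeTextToStreamAsync(input);

var mediaElement = new MediaElement();
mediaElement.SetSource(stream, stream.ContentType);   // feed the speech stream to the MediaElement
mediaElement.Play();
```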

Now, let's add a conditional statement to check whether the user has entered anything in the provided TextBox; if not, a message dialog will pop up to remind the user to input some text. If the TextBox is not empty, which means the user has entered a string, then use the MediaElement to play the input text. Note that you need to insert the related library for Windows.UI.Popups to enable the alert message dialog.

14. CheckIfUserInputSomething

15. UsingUIpopUP
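Putting the check and the playback together, a sketch of this step (again with illustrative names) might be:

```
if (string.IsNullOrWhiteSpace(userInput.Text))
{
    // Nothing was typed: remind the user with a simple alert dialog.
    // Requires: using Windows.UI.Popups;
    var dialog = new MessageDialog("Please enter some text first.");
    await dialog.ShowAsync();
}
else
{
    // The TextBox is not empty: speak the entered text, as in the previous sketch.
    var synthesizer = new SpeechSynthesizer();
    var stream = await synthesizer.SynthesizeTextToStreamAsync(userInput.Text);
    var mediaElement = new MediaElement();
    mediaElement.SetSource(stream, stream.ContentType);
    mediaElement.Play();
}
```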

Note that SynthesizeTextToStreamAsync is an asynchronous method; hence, you need to add "async" to your function. Add "async" before the void, as follows:

16. AddAsync
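With that change, the handler signature becomes (sketch):

```
// "async" lets the method body await SynthesizeTextToStreamAsync and ShowAsync.
private async void Button_Click(object sender, RoutedEventArgs e)
{
    // ... body as sketched above ...
}
```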

Congratulations! Now your program is able to speak according to the user’s text input. Try out your program by clicking the run button (the green button). Enjoy your program!

17. RunLocalMachine

How iBeacon and Eddystone shape the IoT Ecosystem

Learning - Beacon

Beacons have been a buzzword bombarding the IoT ecosystem in recent years. These beacons are tiny devices that operate on top of Bluetooth Low Energy (BLE), a technology to empower IoT development, according to the Bluetooth SIG. One should not confuse BLE with beacons: BLE is a technology, and a beacon is just a device realized by this technology. Or, to put it another way: beacons are a subset of BLE. Now we are clear about BLE and beacons, but what exactly is a beacon? I believe you have also heard of iBeacon and Eddystone, but what are they, and how do they help to advance the IoT ecosystem? In this post, we will first review the beacon, and then we will have a quick comparison between iBeacon and Eddystone. Hopefully, you can grasp the big picture regarding this tiny beacon and further unleash its potential for IoT development. Ultimately, we hope that this post can help you look beyond the current beacon trend and start to think of the possible challenges that our market might face with such explosive beacon adoption in both public and private spaces.

ibeaconEddstyone trend.PNG

So, let's talk about the beacons. Yes, you can find a lot of beacon manufacturers these days, and each of them promises to give you the best experience with certain specific features which are only available in their beacon. You can always check out their websites to learn more about their beacons. But be aware and don't get deceived by the stylish exterior design; in fact, a beacon is just a tiny device that advertises BLE signals at a pre-defined interval. The format of the bit stream encapsulated in the advertised signals depends on the employed communication protocol. Yes, this is where iBeacon and Eddystone come in. Simply put, iBeacon and Eddystone are just two different standards that define the format of the encapsulated bitstream. By inspecting the communication protocol (i.e., either iBeacon or Eddystone), the receiver can then decode the signal accordingly.

 

iBeacon

Well, now you should have some basic ideas regarding iBeacon. Again, please don't confuse iBeacon with the beacon. Beacon is just a generic term used to describe a BLE device which operates in broadcasting mode, without implying any specific communication protocol. iBeacon, on the other hand, refers to a beacon device that implements the communication protocol specified by Apple. The history of iBeacon can be traced back to 2013, when Apple announced their iBeacon protocol during the Worldwide Developers Conference. As with many Apple products, iBeacon is designed primarily for iOS devices; however, Android or any other device can also interact with an iBeacon as long as it is BLE-compatible. That said, since it is a proprietary protocol from Apple, it works best with iOS devices comparatively. iBeacon only works with native apps, meaning that we need to install an app to interact with iBeacon devices. Through the installed app, the device can acquire the UUID, major and minor encapsulated in the advertised signals for further interaction. Note that the major and minor are just numbers ranging from 0 to 65535; they do not contain any multimedia content or interaction commands. The receiver always needs to look the major and minor up on a content management server to retrieve the corresponding content or further instructions. While some might consider this a possible drawback of iBeacon, the major-minor pair in fact gives much greater flexibility in content management and delivery. Furthermore, there are many possibilities with major and minor besides content mapping, and they allow us greater flexibility to manipulate the content and add in more intelligent elements. I am not sure if you got the idea that I would like to share here; please leave your comments if you are inspired. Share with us and the community if you come up with a better way of using iBeacon besides the push notifications and content delivery which have been widely employed by the retail industry.
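To make the UUID/major/minor idea more concrete, here is a rough C# sketch (illustrative only, not tied to any particular BLE scanning API, and the class and method names are made up for this post) of pulling those fields out of the 23-byte Apple manufacturer-specific payload that an iBeacon advertises: two header bytes 0x02 0x15, a 16-byte proximity UUID, a big-endian major and minor, and a signed TX-power byte.

```
using System;

static class IBeaconPayload
{
    // Sketch: decode the manufacturer-specific data that follows Apple's
    // 0x004C company identifier in an iBeacon advertisement.
    public static (string Uuid, ushort Major, ushort Minor, sbyte TxPower) Parse(byte[] data)
    {
        if (data == null || data.Length != 23 || data[0] != 0x02 || data[1] != 0x15)
            throw new ArgumentException("Not an iBeacon payload.");

        // Bytes 2..17: the 16-byte proximity UUID, shown here simply as a hex string.
        string uuid = BitConverter.ToString(data, 2, 16).Replace("-", "");

        // Bytes 18..21: major and minor, each big-endian and in the range 0-65535.
        ushort major = (ushort)((data[18] << 8) | data[19]);
        ushort minor = (ushort)((data[20] << 8) | data[21]);

        // Byte 22: calibrated TX power (signed dBm at 1 m), used for distance estimation.
        sbyte txPower = (sbyte)data[22];

        return (uuid, major, minor, txPower);
    }
}
```

An app would then look the (UUID, major, minor) triple up on its content server to decide what to show the user.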

 

Eddystone

Eddystone, similar to iBeacon, is just another baby belonging to the same subset (i.e., beacon) of BLE. In other words, iBeacon is a big brother to Eddystone, and Eddystone is two years younger than iBeacon. Following Apple's footsteps, Google launched Eddystone in July 2015. While iBeacon only encapsulates the UUID, major and minor in its advertising signals, Eddystone promises a wider spectrum in which the advertising signals can carry much richer data besides the UUID, major and minor (known as the UID packet in Eddystone). The other two extra packets available through Eddystone are the URL and TLM packets. You can find a clean and clear description of these three Eddystone-specific packets here. They also list the differences between iBeacon and Eddystone in a straightforward manner. In fact, the concept of Eddystone-URL is similar to that of the QR code: it encapsulates a website address which you can open with your web browser. The major difference is how users discover the website address. With a QR code, users need to take the initiative to scan the code to access the web, whereas with Eddystone-URL, users will be notified by the app as they pass by the installed Eddystone. Accessing the web is easier and hassle-free with the active advertising of the Eddystone, compared to the passive mode of the QR code, in which active involvement from the user is needed to retrieve the web address. In contrast to the proprietary iBeacon, Eddystone is an open protocol, so developers can enjoy greater flexibility when developing Eddystone-associated IoT applications. So, does Google give you any inspiration with their open Eddystone protocol?

ibeaconEddstyone brother.png
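For comparison, here is an equally rough sketch of how a receiver might tell the Eddystone frame types apart. The first byte of the Eddystone service data identifies the frame, and the class name below is made up for this post; the Eddystone-URL expansion codes (such as the byte that stands for ".com/") are left out for brevity, so this is not a complete decoder.

```
using System;
using System.Text;

static class EddystoneFrame
{
    // Sketch: classify an Eddystone frame from its service data.
    // Frame types: 0x00 = UID, 0x10 = URL, 0x20 = TLM.
    public static string Describe(byte[] serviceData)
    {
        if (serviceData == null || serviceData.Length == 0)
            return "Empty frame";

        switch (serviceData[0])
        {
            case 0x00:
                return "UID frame (10-byte namespace + 6-byte instance identifier)";
            case 0x10:
            {
                if (serviceData.Length < 3)
                    return "URL frame (truncated)";
                // Byte 1 = TX power, byte 2 = URL scheme prefix, rest = encoded URL.
                string[] schemes = { "http://www.", "https://www.", "http://", "https://" };
                string scheme = serviceData[2] < schemes.Length ? schemes[serviceData[2]] : "?";
                string rest = Encoding.ASCII.GetString(serviceData, 3, serviceData.Length - 3);
                return "URL frame: " + scheme + rest;
            }
            case 0x20:
                return "TLM frame (telemetry: battery voltage, temperature, counters)";
            default:
                return "Unknown frame type";
        }
    }
}
```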

iBeacon vs Eddystone

People are excited about the possibilities of beacons in enhancing the IoT ecosystem, especially with the active involvement of the two giants – Apple and Google. While the market is witnessing explosive beacon adoption, it still lacks a mature platform to fully integrate beacons and thus unleash their full potential. Such explosive adoption by marketers without holistic consideration might well lead to serious problems affecting the healthy development cycle of the IoT ecosystem. So, instead of choosing between iBeacon and Eddystone, or trying to come up with a better protocol, have you ever thought about the possible challenges that we might face in the near future with massive numbers of beacons in our proximity? How are these beacons, be it iBeacon, Eddystone or your own beacon standard, going to affect how we interact with the people and the IoT devices in our proximity? We do not have an answer for it now, but you can probably imagine a possible future where everyone (including you) is surrounded by a number of known or unknown beacons. I am not sure how these are going to change the way we interact, but definitely, our proximity will no longer be limited to mere people or device interaction, but rather a cross-domain interaction between people, devices and even cyberspace.