Stack Overflow is a question and answer site for professional and enthusiast programmers.

I have some doubts about converting from std_logic_vector to signed/unsigned. I always use the conversions signed(...) and unsigned(...), but when I try the conversion functions defined in the numeric_std package (to_signed, to_unsigned), they don't work. Can someone explain why this happens, and why the signed() and unsigned() conversions do work?


2 Answers

Look at the package specification for numeric_std. You will find that to_signed converts to signed ... from an integer, not from a std_logic_vector.

As "vermaete" says, signed is a closely related type to std_logic_vector so you don't need a conversion function there.
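To make the distinction concrete, here is a minimal sketch (signal names and widths are illustrative) showing which forms are legal: a type conversion (cast) for the closely related vector types, and to_signed only for an integer source.

```vhdl
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity conv_example is
end entity;

architecture rtl of conv_example is
  signal slv : std_logic_vector(7 downto 0) := x"2A";
  signal s   : signed(7 downto 0);
begin
  -- Type conversion (cast): legal because signed and std_logic_vector
  -- are closely related array types with the same element type.
  s <= signed(slv);

  -- to_signed takes an INTEGER and a length, not a vector:
  -- s <= to_signed(slv, 8);   -- illegal: slv is not an integer
  -- s <= to_signed(-42, 8);   -- legal: integer -> signed
end architecture;
```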


Because std_logic_vector and the signed/unsigned types are closely related, you can convert with a type cast, so signed(a_std_logic_vector) and unsigned(a_std_logic_vector) both work. The standard also defines conversion functions, but to_signed and to_unsigned take an integer argument, not a vector.
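A typical use of the cast is a round trip: cast a std_logic_vector in, do arithmetic on the unsigned view, then cast back out. A minimal sketch (names and widths are illustrative):

```vhdl
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity cast_roundtrip is
end entity;

architecture rtl of cast_roundtrip is
  signal bus_in  : std_logic_vector(3 downto 0) := "1001";
  signal count   : unsigned(3 downto 0);
  signal bus_out : std_logic_vector(3 downto 0);
begin
  count   <= unsigned(bus_in) + 1;      -- cast in, then do arithmetic
  bus_out <= std_logic_vector(count);   -- cast back out to a plain vector
end architecture;
```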

Take a look at the VHDL FAQ. It is an old website from the days when newsgroups were still popular, but it still has plenty of good information about VHDL.


