Read input (from a command-line argument or stdin) consisting of a full stop (`.`) followed by a sequence of octal digits; this represents a number between 0 and 1. Convert this number to a mixed-radix notation based on pi, where the radix of the k-th digit after the point is the k-th decimal digit of pi (3.14159...) plus 16. The output should be represented using the digits 0-9 and the first (radix - 10) lowercase English letters.
Round the output so that its length is twice the length of the input (excluding the periods).
Examples:
Input: .1
Output: .23
The first digit is in base 17 (1 + 16) and the second in base 20 (4 + 16): octal .1 is 1/8 = 2.125/17, leaving a remainder that scales to 2.5 in base 20, which rounds up to 3.
Input: .000
Output: .000000
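To make the spec concrete, here is a reference-implementation sketch in Python (not a golfed answer; the function name `to_pi_mixed_radix` and the hard-coded table of pi digits are my own choices, and it only handles inputs of up to 10 octal digits):

```python
from fractions import Fraction

# Decimal digits of pi after the point; the radix of output digit k
# is PI_DIGITS[k] + 16, i.e. 17, 20, 17, 21, 25, ...
# Extend this table to support longer inputs.
PI_DIGITS = [1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8, 9, 7, 9, 3, 2, 3, 8, 4, 6]

DIGITS = '0123456789abcdefghijklmnopqrstuvwxyz'

def to_pi_mixed_radix(s):
    """Convert '.xyz' (an octal fraction in [0, 1)) to pi-based mixed radix."""
    frac = s.lstrip('.')
    n = len(frac)                                # output has 2*n digits
    value = Fraction(int(frac, 8), 8 ** n)       # exact octal fraction
    out = []
    for k in range(2 * n):
        radix = PI_DIGITS[k] + 16
        value *= radix
        digit = int(value)                       # floor of scaled remainder
        out.append(digit)
        value -= digit
    if value >= Fraction(1, 2):                  # round half up on the leftover
        i = 2 * n - 1
        out[i] += 1
        while i > 0 and out[i] == PI_DIGITS[i] + 16:   # propagate any carry
            out[i] = 0
            i -= 1
            out[i] += 1
    return '.' + ''.join(DIGITS[d] for d in out)
```

Using exact `Fraction` arithmetic avoids floating-point error in the rounding step: `to_pi_mixed_radix('.1')` gives `.23` and `to_pi_mixed_radix('.000')` gives `.000000`, matching the examples above.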
All answers must correctly report the conversion of .14155 in their answer.
code-challenge