
Precision to calculate Pi

I am trying to calculate pi with the Taylor (Leibniz) series: pi = 4 - 4/3 + 4/5 - 4/7 + 4/9 - 4/11 + ... But I have to find out how many terms I need to keep the error below a specific number of decimal places. For example, how many terms of the series should I use to get precision up to the 10th decimal place of pi? What is the formula for the error in this case?

21st Sep 2020, 3:34 AM
Lucas Kliemczak
1 Answer
I found the error formula for this series. By the alternating series estimation theorem, the truncation error after k terms is bounded by the first omitted term: error = 4/(2*k+1), where k is the number of terms in the series. I also tested an algorithm in Java as follows.

```java
public class Program {
    public static void main(String[] args) {
        double pi = 0;
        long k = 10000000; // number of terms to sum
        for (long i = 0; i < k; i++) {
            // alternating terms: +4/1 - 4/3 + 4/5 - ...
            pi += 4 * Math.pow(-1, i) / (2 * i + 1);
        }
        double error = 4.0 / (2 * k + 1); // bound on the truncation error
        System.out.println(pi);
        System.out.println(error);
        System.out.format("%.15f", error);
    }
}
```

https://code.sololearn.com/c2TmRPtDM3iE/?ref=app
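To answer the original question directly, the bound error = 4/(2*k+1) can be inverted to find how many terms are needed for d correct decimal places: require 4/(2*k+1) < 0.5 * 10^-d and solve for k. A minimal sketch (the class and variable names are my own, not from the thread):

```java
public class TermsForPrecision {
    public static void main(String[] args) {
        int d = 10; // target number of correct decimal places
        // error must be below half a unit in the d-th decimal place
        double tolerance = 0.5 * Math.pow(10, -d);
        // 4/(2k+1) < tolerance  =>  k > (4/tolerance - 1) / 2
        long k = (long) Math.ceil((4.0 / tolerance - 1.0) / 2.0);
        System.out.println(k); // on the order of 4 * 10^10 terms
    }
}
```

This shows why the series converges too slowly to be practical for high precision: each extra decimal place multiplies the required term count by roughly 10.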
21st Sep 2020, 1:30 PM
Lucas Kliemczak