Managing Memory Available for Spark Drivers
By default, the memory allocated to the Spark driver process is 0.8 of the total memory allocated to the runtime container. To allocate more or less memory to the driver, override this default by setting the spark.driver.memory property in spark-defaults.conf (as described above).
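For example, the following entry in spark-defaults.conf sets the driver memory to a fixed 4 GB rather than the container-derived default (the 4g value is illustrative; choose a size appropriate to your workload and container limits):

```
# spark-defaults.conf
# Override the default driver memory (0.8 of the container's memory)
spark.driver.memory    4g
```

Note that spark.driver.memory takes effect at driver startup, so the value must be set before the application launches; changing it in a running application has no effect.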
