Hi, first time posting. Sorry if this is in the wrong forum.
So far I've been trying my new hardware out on quite a few games. I noticed that in ARK the optimize setting maxes everything out. Yes, it can run the game with everything maxed, but I'm lucky to get 15-20 FPS, so I'm not sure why the optimizer is going for maxed-out settings.
My current hardware at the time of this post:
GPU - MSI AMD Radeon R9 390 8GB Gaming Edition
CPU - AMD FX-8350 Black Edition Vishera 8-Core 4.0 GHz Socket AM3+ 125W (overclocked, water block cooled with a push-pull setup)
RAM - G.SKILL Sniper Gaming Series 16GB (2 x 8GB) 240-Pin DDR3 1866 (PC3 14900)
MOBO - ASUS ROG Crosshair V Formula-Z AM3+
PSU - LEPA G1200-MA 1200W
HDD/SSD/Storage - Mushkin Enhanced Striker 480GB SATA III
Monitor - HP LA2205WG 22" (DisplayPort with max Res. at 1680x1050)
My current in-game settings (only non-maxed/Epic settings are shown):
Distant Field Ambient Occlusion - Off
General Shadows - Medium (this one makes the biggest performance difference)
Sky Quality - 0-30% (this one I vary between settings)
Terrain Shadows - Medium
View Distance - High
World Tile Buffers - High
(Not shown in Raptr)
Resolution Scale - About 80%
With these settings I usually get about 25-40 FPS depending on the area and how cluttered it is with buildings, animals, rocks, trees, and so on.
Granted, I understand the game is still in beta, and ARK has made DRASTIC performance improvements since a few months ago, but the optimizer's recommendation is not accurate unless it's based on something I'm not seeing. The base resolution of the actual screen doesn't affect this game either: I can drop it all the way down to 640x480 and it runs the same as at my max resolution. That may change later on, but as of now I've noticed it does nothing on my end, and I tested it on a couple of different lower-end cards.
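For what it's worth, here's the rough pixel math I had in mind when testing this. It assumes Resolution Scale multiplies each axis of the output resolution, which is my assumption about how the engine applies it, not something I've confirmed for ARK:

```python
# Rough sketch of how many pixels the GPU shades per frame.
# Assumption: Resolution Scale multiplies each axis (width and height).

def rendered_pixels(width, height, res_scale=1.0):
    """Approximate pixels rendered per frame at a given resolution scale."""
    return int(width * res_scale) * int(height * res_scale)

native = rendered_pixels(1680, 1050)        # full 1680x1050
scaled = rendered_pixels(1680, 1050, 0.80)  # ~80% scale -> 1344x840
low    = rendered_pixels(640, 480)          # lowest base resolution

print(f"native 1680x1050 : {native:,}")
print(f"80% scale        : {scaled:,} ({scaled / native:.0%} of native)")
print(f"640x480          : {low:,}")
```

If the math above held and the game were GPU-bound on pixel work, dropping to 640x480 (about a sixth of the pixels) should show a big FPS gain. Since it doesn't on my end, the bottleneck seems to be elsewhere, which is part of why I'm asking whether it's driver- or game-related.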
My main reason for posting is feedback and input, and to ask whether this is driver-related and/or game-related.